Re: [R-sig-Geo] Mask from raster with less RAM

2020-08-14 Thread Frederico Faleiro
Hi guys,

thank you for your help. Sometimes we really need more RAM. I got access to
a cluster with more RAM and solved the problem.
Dear Sourav, you are right, but the crop only cuts the extent of the
rasters, so I really need to use mask. In addition, my main limitation was
in generating the polygon from raster (i.e. rasterToPolygons).

Cheers!

Em qua., 12 de ago. de 2020 às 15:20, Sourav Sarkar <
sourav.sar...@ahduni.edu.in> escreveu:

> Dear Frederico,
> With the very little experience that I have with raster data, it seems to
> me (maybe I am wrong) that `crop' is computationally easier than `mask'. If
> the crop command serves your purpose, maybe you can try:
> r2.mask <- crop(r2, pol)
>
> Kind regards,
>
> বুধ, ১২ আগস্ট, ২০২০ তারিখে ২:১৩ AM টায় এ Frederico Faleiro <
> fvfale...@gmail.com> লিখেছেন:
>
>> Dear all,
>>
>> I would like to generate a mask from a raster, but my workflow needs a lot
>> of RAM to process big rasters. I need this polygon mask to use in
>> another's
>> rasters.
>> Do you know another approach that needs less RAM?
>>
>> # reproducible example
>> library(raster)
>> # read data to create mask
>> r <- raster(system.file("external/test.grd", package="raster"))
>> r[!is.na(r)] <- 1
>> pol <- rasterToPolygons(r, dissolve = T) #  a lot of RAM to process the
>> data
>> # apply the mask in another raster
>>  r2 <- raster(extent(r), res(r))
>> r2[ ] <- 1
>> r2.mask <- mask(r2, pol)
>>
>> Cheers!
>>
>> [[alternative HTML version deleted]]
>>
>> _______
>> R-sig-Geo mailing list
>> R-sig-Geo@r-project.org
>> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>>
>
>
> --
> Sourav Sarkar,
> Assistant Professor,
> School of Arts and Sciences,
> Ahmedabad University
>


-- 
Frederico Faleiro
Postdoctoral Researcher in the INCT-EECBio (https://www.eecbio.ufg.br/)
Department of Ecology | Institute of Biological Science | Federal
University of Goiás | Brazil
RG: https://www.researchgate.net/profile/Frederico_Faleiro
CV: http://lattes.cnpq.br/4926404840659003

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Mask from raster with less RAM

2020-08-12 Thread Frederico Faleiro
Hi guys,

I really need to use the mask without modifying the original resolution of
the rasters, so vectorizing was my first try. I know I can resample first
before applying a raster mask, but this will not work for my objectives
that involve creating some visualisations to the same area with the
original resolution.
Roger, I have tried use the google colab in the cloud (e.g.
https://colab.research.google.com/drive/1BYnnbqeyZAlYnxR9IHC8tpW07EpDeyKR#scrollTo=uaE0kZ0vkqms)
and I think you are right about the restriction about the use of external
softwares like GDAL in this specific case.
Do you know if is it possible to do it in GRASS or QGIS? I only find the
function polygonize (as in raster::rasterToContour) that generates contours
around equal values from the raster.

Cheers,


Em qua., 12 de ago. de 2020 às 13:45, Roger Bivand 
escreveu:

> On Wed, 12 Aug 2020, Frederico Faleiro wrote:
>
> > Hi guys, thank you for your reply.
> >
> > Jaime, I have tried, but I can't install rgdal needed to use raster and
> > apparently other people too (
> >
> https://stackoverflow.com/questions/57617895/how-to-install-rgdal-and-or-upload-raster-on-google-collaboration
> ).
> > Do you have a solution?
>
> Use CRAN Windows or MacOS binaries, or install the system requirements.
> Unless you can do that, for example because you do not control the
> platform you are using (are you working on a cloud instance?), do not
> install packages needing external software
> from source.
>
> > Hugo, I need the mask in vector format because the rasters have different
> > resolutions, so I can't use raster as a mask. I have modified the reprex
> to
> > be more precise about it (see below).
>
> Please use GRASS. What you are trying to do is something that has "just
> worked" in GRASS since it was first created. GRASS can be run from the
> shell, from Python and from R (rgrass7). Or use SAGA, another fast raster
> processor. Both mask from raster directly. Write a shell script for GRASS
> to resample your rasters, mask them, and complete. You can also use GRASS
> and SAGA from QGIS. You might use Python or R to make file name handling
> "easier" than in a script. Neither GRASS nor SAGA use much memory unless
> rasters are huge, and then they are lean.
>
> Roger
>
> > Steve, I think this approach has the same issue of Hugo Costa. I don't
> have
> > the polygon of the mask, so I am trying to create one to apply in the
> other
> > rasters of different resolutions.
> >
> > # reproducible example
> > library(raster)
> > # read data to create mask
> > r <- raster(system.file("external/test.grd", package="raster"))
> > r[!is.na(r)] <- 1
> > pol <- rasterToPolygons(r, dissolve = T) #  a lot of RAM to process the
> data
> > # raster of different resolution
> > res2 <- res(r) + 10
> > r2 <- raster(extent(r), resolution = res2)
> > r2[ ] <- 1
> > # apply the mask
> > r2.mask <- mask(r2, pol)
> > # plot
> > par(mfrow = c(1, 3))
> > plot(r)
> > plot(r2)
> > plot(r2.mask)
> >
> > Cheers!
> >
> > Em ter., 11 de ago. de 2020 às 19:26, Stephen Stewart <
> > stephen.stewar...@gmail.com> escreveu:
> >
> >> Hi Frederico,
> >>
> >> It may not solve all of your RAM issues, but in this situation I would
> >> skip the rasterToPolygons (which is also usually very slow) and use
> raster
> >> math to propagate NAs.
> >>
> >> r <- raster(system.file("external/test.grd", package="raster"))
> >> r[!is.na(r)] <- 1
> >> # Can also be faster to do r = r / r, but add an offset (that cannot
> >> result in 0) if you have valid 0s.
> >>  r2 <- raster(extent(r), res(r))
> >> r2[ ] <- 1
> >> r2.mask <- r * r2
> >>
> >> If you have a polygon to use as a mask, burn it in using the fasterize
> >> package and then apply the above.
> >>
> >> Hope that helps.
> >>
> >> Cheers,
> >>
> >> Steve
> >>
> >> On Wed., 12 Aug. 2020, 6:43 am Frederico Faleiro, 
> >> wrote:
> >>
> >>> Dear all,
> >>>
> >>> I would like to generate a mask from a raster, but my workflow needs a
> lot
> >>> of RAM to process big rasters. I need this polygon mask to use in
> >>> another's
> >>> rasters.
> >>> Do you know another approach that needs less RAM?
> >>>
> >>> # reproducible example
> >>> library(raster)

Re: [R-sig-Geo] Mask from raster with less RAM

2020-08-12 Thread Frederico Faleiro
Hi guys, thank you for your reply.

Jaime, I have tried, but I can't install rgdal needed to use raster and
apparently other people too (
https://stackoverflow.com/questions/57617895/how-to-install-rgdal-and-or-upload-raster-on-google-collaboration).
Do you have a solution?
Hugo, I need the mask in vector format because the rasters have different
resolutions, so I can't use raster as a mask. I have modified the reprex to
be more precise about it (see below).
Steve, I think this approach has the same issue of Hugo Costa. I don't have
the polygon of the mask, so I am trying to create one to apply in the other
rasters of different resolutions.

# reproducible example
library(raster)
# read data to create mask
r <- raster(system.file("external/test.grd", package="raster"))
r[!is.na(r)] <- 1
pol <- rasterToPolygons(r, dissolve = T) #  a lot of RAM to process the data
# raster of different resolution
res2 <- res(r) + 10
r2 <- raster(extent(r), resolution = res2)
r2[ ] <- 1
# apply the mask
r2.mask <- mask(r2, pol)
# plot
par(mfrow = c(1, 3))
plot(r)
plot(r2)
plot(r2.mask)

Cheers!

Em ter., 11 de ago. de 2020 às 19:26, Stephen Stewart <
stephen.stewar...@gmail.com> escreveu:

> Hi Frederico,
>
> It may not solve all of your RAM issues, but in this situation I would
> skip the rasterToPolygons (which is also usually very slow) and use raster
> math to propagate NAs.
>
> r <- raster(system.file("external/test.grd", package="raster"))
> r[!is.na(r)] <- 1
> # Can also be faster to do r = r / r, but add an offset (that cannot
> result in 0) if you have valid 0s.
>  r2 <- raster(extent(r), res(r))
> r2[ ] <- 1
> r2.mask <- r * r2
>
> If you have a polygon to use as a mask, burn it in using the fasterize
> package and then apply the above.
>
> Hope that helps.
>
> Cheers,
>
> Steve
>
> On Wed., 12 Aug. 2020, 6:43 am Frederico Faleiro, 
> wrote:
>
>> Dear all,
>>
>> I would like to generate a mask from a raster, but my workflow needs a lot
>> of RAM to process big rasters. I need this polygon mask to use in
>> another's
>> rasters.
>> Do you know another approach that needs less RAM?
>>
>> # reproducible example
>> library(raster)
>> # read data to create mask
>> r <- raster(system.file("external/test.grd", package="raster"))
>> r[!is.na(r)] <- 1
>> pol <- rasterToPolygons(r, dissolve = T) #  a lot of RAM to process the
>> data
>> # apply the mask in another raster
>>  r2 <- raster(extent(r), res(r))
>> r2[ ] <- 1
>> r2.mask <- mask(r2, pol)
>>
>> Cheers!
>>
>> --
>> Frederico Faleiro
>> Postdoctoral Researcher in the INCT-EECBio (https://www.eecbio.ufg.br/)
>> Department of Ecology | Institute of Biological Science | Federal
>> University of Goiás | Brazil
>> RG: https://www.researchgate.net/profile/Frederico_Faleiro
>> CV: http://lattes.cnpq.br/4926404840659003
>>
>> [[alternative HTML version deleted]]
>>
>> ___
>> R-sig-Geo mailing list
>> R-sig-Geo@r-project.org
>> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>>
>

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Comparing 2 rasterbricks using functions from TSdist package

2020-05-28 Thread Frederico Faleiro
Hi Jackson,

you could use the complete.cases function to remove the pairs with NA
before run the distance function as bellow.

# CorDistance function does not handle NAs
x <- cbind(as.vector(s[[1]]), as.vector(s2[[1]]))
select.rows <- complete.cases(x)
CorDistance(x[select.rows, ] )

Cheers!
-- 
Frederico Faleiro
Postdoctoral Researcher in the INCT-EECBio (https://www.eecbio.ufg.br/)
Department of Ecology | Institute of Biological Science | Federal
University of Goiás | Brazil
RG: https://www.researchgate.net/profile/Frederico_Faleiro
CV: http://lattes.cnpq.br/4926404840659003

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Convert rectilinear to regular grid in R (stars and raster)

2020-03-24 Thread Frederico Faleiro
Hi Edzer,

thank you for your reply and modifications in the function.
Now the code is working fine, but the structure of the output was changed.
When I try extract the values converting to data.frame all values was
replaced by NA. However the plot is OK as you have tested.

1. How can I create a new grid with the new resolution in stars? Am I doing
it correctly (see bellow)?
2. How can I get the data.frame with the new values?
3. I have tried the function sf::st_interpolate_aw (see bellow), but there
is some warnings that I do not know the consequences for the calculation.

# update packages
library(devtools)
install_github("r-spatial/sf")
install_github("r-spatial/stars")

library(stars)

# read file
bc <- read_ncdf("pr_Amon_bcc-csm1-1_rcp45_r1i1p1_200601-209912.nc")
# create the new grid
grid_st <- st_as_stars(st_bbox(c(xmin=0,xmax=360,ymin=-90,ymax=90))) %>%
st_set_crs(4326)
# change resolution. Is it right?
attr(grid_st, "dimensions")[[1]]$delta <- 2.8125
attr(grid_st, "dimensions")[[2]]$delta <- - 2.8125

# change from rectilinear to regular grid with stars
bc_reg <- bc %>% st_warp(grid_st, method = "near")
# plot
plot(bc_reg[, , , 1:12])

# convert to data.frame
bc_reg_df <- as.data.frame(bc_reg_st)
# there is only NA in the variable
str(bc_reg_df)
range(bc_reg_df$pr)

# change from rectilinear to regular grid with sf
grid_sf <- st_as_sf(grid_st)
plot(grid_sf)
bc_sf <- st_as_sf(bc)
plot(bc_sf)
bc_sf_reg <- st_interpolate_aw(bc_sf, grid_sf, F)
#although coordinates are longitude/latitude, st_intersection assumes that
they are planar
#Warning message:
#In st_interpolate_aw.sf(bc_sf, grid_sf, F) :
#  st_interpolate_aw assumes attributes are constant over areas of x

Best regards,
Frederico Faleiro
Postdoctoral Researcher in the INCT-EECBio (https://www.eecbio.ufg.br/)
Federal University of Goiás | Brazil
RG: https://www.researchgate.net/profile/Frederico_Faleiro


Em ter., 24 de mar. de 2020 às 11:59, Edzer Pebesma <
edzer.pebe...@uni-muenster.de> escreveu:

> The stars part of this question has also been sent as an issue on
> github, and that is where it has been answered (partially solved):
>
> https://github.com/r-spatial/stars/issues/264
>
> please feel free to send follow-up's here, or there.
>
> On 3/23/20 6:45 PM, Frederico Faleiro wrote:
> > Dear all,
> >
> > I am trying convert netCDF data from rectilinear to regular grid in R,
> but
> > I am not sure what is the best approach for it.
> > I would like to convert the grid modifying the original resolution only
> > when necessary. In some cases I must change resolution to keep regular
> cell
> > size and cover all the world extent.
> > I have tried without success st_warp from stars and I find a way with
> > rasterize from raster package (see script bellow).
> >
> > I have the following questions:
> > 1. What am I doing wrong in stars package (see script bellow)?
> > 2. Is there a way to rasterize points to raster using metrics that
> consider
> > distance (e.g. bilinear, nearest neighbor, etc)? Note that I need
> > use points in raster package because it do not handle with rectilinear
> grid.
> > 3. Do you recommend any other approach?
> >
> > The data are available in:
> > GDrive:
> >
> https://drive.google.com/file/d/1HKxFpAwvY9k_cFbasagKHAwi1luzLv90/view?usp=sharing
> > CMIP5 portal (you must make a login):
> >
> http://aims3.llnl.gov/thredds/fileServer/cmip5_css01_data/cmip5/output1/BCC/bcc-csm1-1/rcp45/mon/atmos/Amon/r1i1p1/v20120705/pr/pr_Amon_bcc-csm1-1_rcp45_r1i1p1_200601-209912.nc
> >
> > # script
> > library(stars)
> > library(raster)
> >
> > # prepare data
> > bc <- read_ncdf("pr_Amon_bcc-csm1-1_rcp45_r1i1p1_200601-209912.nc")
> > # get coordinates and data from the first time period
> > bc1 <- as.data.frame(bc[, , , 1])
> > # convert longitude to variate between -180 and 180
> > bc1$lon <- ifelse(bc1$lon > 180, bc1$lon - 360, bc1$lon)
> >
> > # original resolution
> > bc1$lon %>% diff %>% table
> > bc1$lat %>% diff %>% table
> > # new resolution
> > new_res <- c(2.8125, 2.8125)
> > # Is it multiple of 180 or 360?
> > 180 %% new_res
> >
> > # create a new standard grid
> > wgs84 <- "+proj=longlat +ellps=WGS84 +towgs84=0,0,0,0,0,0,0 +no_defs"
> > grid_r <- raster(res = new_res, crs = wgs84)
> > grid_st <- st_as_stars(grid_r)
> >
> > # change from rectilinear to regular grid
> > # stars package: use st_warp to change from rectilinear to regular raster
> > with different methods
> > bc_reg <- bc %>% st_warp(grid_st, method = "aver

Re: [R-sig-Geo] Modifications of the methods in resample() function using raster package

2020-03-23 Thread Frederico Faleiro
Hi Alexandre,

you can use extract from raster (check the examples in the help page).
Depending of your data you can create a spatial polygon that cover the
number of cells you want and define the summary function or use the buffer
argument.

Best regards,

Frederico Faleiro
Postdoctoral Researcher in the INCT-EECBio (https://www.eecbio.ufg.br/)
Federal University of Goiás | Brazil
RG: https://www.researchgate.net/profile/Frederico_Faleiro



Em seg., 23 de mar. de 2020 às 10:41, ASANTOS via R-sig-Geo <
r-sig-geo@r-project.org> escreveu:

> Dear r-sig-geo members,
>
>  ??? I?ve like to know the possibility of make some modifications in
> resample() function in raster package. First, in "bilinear" method by
> default is assigns a weighted average of the four nearest cells, and
> I?ll like to change for five and six nearest cells, too, is possible?
> Second, is it possible too to create the mean method to calculate the
> arithmetic average of the four, five, and six nearest cells?
>
> Thanks in advanced,
> Alexandre
>
> --
> Alexandre dos Santos
> Geotechnologies and Spatial Statistics applied to Forest Entomology
> Instituto Federal de Mato Grosso (IFMT) - Campus Caceres
> Caixa Postal 244 (PO Box)
> Avenida dos Ramires, s/n - Vila Real
> Caceres - MT - CEP 78201-380 (ZIP code)
> Phone: (+55) 65 99686-6970 / (+55) 65 3221-2674
> Lattes CV: http://lattes.cnpq.br/1360403201088680
> OrcID: orcid.org/-0001-8232-6722
> ResearchGate: www.researchgate.net/profile/Alexandre_Santos10
> Publons: https://publons.com/researcher/3085587/alexandre-dos-santos/
> --
>
> ___
> R-sig-Geo mailing list
> R-sig-Geo@r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


[R-sig-Geo] Convert rectilinear to regular grid in R (stars and raster)

2020-03-23 Thread Frederico Faleiro
Dear all,

I am trying convert netCDF data from rectilinear to regular grid in R, but
I am not sure what is the best approach for it.
I would like to convert the grid modifying the original resolution only
when necessary. In some cases I must change resolution to keep regular cell
size and cover all the world extent.
I have tried without success st_warp from stars and I find a way with
rasterize from raster package (see script bellow).

I have the following questions:
1. What am I doing wrong in stars package (see script bellow)?
2. Is there a way to rasterize points to raster using metrics that consider
distance (e.g. bilinear, nearest neighbor, etc)? Note that I need
use points in raster package because it do not handle with rectilinear grid.
3. Do you recommend any other approach?

The data are available in:
GDrive:
https://drive.google.com/file/d/1HKxFpAwvY9k_cFbasagKHAwi1luzLv90/view?usp=sharing
CMIP5 portal (you must make a login):
http://aims3.llnl.gov/thredds/fileServer/cmip5_css01_data/cmip5/output1/BCC/bcc-csm1-1/rcp45/mon/atmos/Amon/r1i1p1/v20120705/pr/pr_Amon_bcc-csm1-1_rcp45_r1i1p1_200601-209912.nc

# script
library(stars)
library(raster)

# prepare data
bc <- read_ncdf("pr_Amon_bcc-csm1-1_rcp45_r1i1p1_200601-209912.nc")
# get coordinates and data from the first time period
bc1 <- as.data.frame(bc[, , , 1])
# convert longitude to variate between -180 and 180
bc1$lon <- ifelse(bc1$lon > 180, bc1$lon - 360, bc1$lon)

# original resolution
bc1$lon %>% diff %>% table
bc1$lat %>% diff %>% table
# new resolution
new_res <- c(2.8125, 2.8125)
# Is it multiple of 180 or 360?
180 %% new_res

# create a new standard grid
wgs84 <- "+proj=longlat +ellps=WGS84 +towgs84=0,0,0,0,0,0,0 +no_defs"
grid_r <- raster(res = new_res, crs = wgs84)
grid_st <- st_as_stars(grid_r)

# change from rectilinear to regular grid
# stars package: use st_warp to change from rectilinear to regular raster
with different methods
bc_reg <- bc %>% st_warp(grid_st, method = "average")
bc_reg <- bc %>% st_warp(grid_st, method = "near")
bc_reg <- bc %>% st_warp(grid_st, method = "bilinear")
# In all cases I have this error: Error in 1:prod(dims[dxy]) : NA/NaN
argument

# raster package: rasterize the points values to the new regular grid
bc1_sp <- bc1
coordinates(bc1_sp) <- ~ lon + lat
bc1_reg <- rasterize(bc1_sp, grid_r, field = "pr", fun = mean)

# plot the first time period
dev.new(title = "rectilinear")
plot(bc[, , , 1])
dev.new(title = "regular with raster package")
plot(bc1_mn_reg)

Best regards,

Frederico Faleiro
Postdoctoral Researcher in the INCT-EECBio (https://www.eecbio.ufg.br/)
Federal University of Goiás | Brazil
RG: https://www.researchgate.net/profile/Frederico_Faleiro

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Extract VIIRS data in R

2020-03-18 Thread Frederico Faleiro
Hi Miluji,

the R in general do not handle very well with netCDF. However I
recommend you use the netcdf4 package before try others.
This link is a good start:
http://geog.uoregon.edu/bartlein/courses/geog490/week04-netCDF.html#get-coordinate-including-time-variables
.
The coordinates of this file is not in longitude and latitude, it use
"number_of_lines" for X Axis and "number_of_pixels" for Y Axis. This way
the difference between the cells is always one and raster package
interprete the difference as one degree and return an warning about it.
I open the file in the software Panoply (design to work with netCDF)  and
everything is OK apparently.
Check bellow some exploration of the data in R.

library(raster)
library(ncdf4)
library(rgdal)

# netcdf4 package
netcdf_file <- "VNP02DNB_NRT.A2020069.1048.001.nc"
nc <- nc_open(netcdf_file)
print(nc)
# check your variable is structured as:
observation_data/DNB_observations[number_of_pixels,number_of_lines]
# variables available
names(nc[["var"]])
# properties from your variable
ncatt_get(nc, "observation_data/DNB_observations")
# get the variables values and check the structure
obs_matrix <- ncvar_get(nc, "observation_data/DNB_observations")
str(obs_matrix)

# raster package
nc_r <- brick(netcdf_file, lvar=0, values=TRUE,
varname="observation_data/DNB_observations")
# [1] "vobjtovarid4:  WARNING  I was asked to get a varid for
dimension named number_of_pixels BUT this dimension HAS NO DIMVAR! Code
will probably fail at this point"
# [1] "vobjtovarid4:  WARNING  I was asked to get a varid for
dimension named number_of_lines BUT this dimension HAS NO DIMVAR! Code will
probably fail at this point"

Best regards,

Frederico Faleiro
Postdoctoral Researcher in the INCT-EECBio (https://www.eecbio.ufg.br/)
Federal University of Goiás | Brazil
RG: https://www.researchgate.net/profile/Frederico_Faleiro

On Tue, Mar 17, 2020 at 11:39 AM Miluji Sb  wrote:

> Dear all,
>
> Hope everyone is keeping safe.
>
> I am trying to extract/read VIIRS nighttime lights data but the output
> seems rather strange.
>
> I have uploaded the file here
> <https://drive.google.com/open?id=1zpqXJ8AlEcnk6ApRb75tfQc4-Y8AfK5h> so as
> not exceed the size limit of the email.
>
> ## Code ##
> library(ncdf4)
> library(rgdal)
>
> netcdf_file <- c("VNP02DNB_NRT.A2020069.1048.001.nc")
> nl <- brick(netcdf_file, lvar=0, values=TRUE,
> varname="observation_data/DNB_observations")
>
> ## Detail ##
> class  : RasterLayer
> dimensions : 3232, 4064, 13134848  (nrow, ncol, ncell)
> resolution : 1, 1  (x, y)
> extent : 0.5, 4064.5, 0.5, 3232.5  (xmin, xmax, ymin, ymax)
> crs: NA
> source : C:/Users/shour/Desktop/VNP02DNB_NRT.A2020069.1048.001.nc
> names  : DNB.observations.at.pixel.locations
> zvar   : observation_data/DNB_observations
>
> The spatial resolution is supposed to be 750m but this shows 1°. What am I
> doing wrong? Any help will be greatly appreciated. Thank you.
>
> Sincerely,
>
> Millu
>
> [[alternative HTML version deleted]]
>
> ___
> R-sig-Geo mailing list
> R-sig-Geo@r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] raster: stackApply problems..

2019-11-19 Thread Frederico Faleiro
Hi Leonidas,

both results are in the same order, but the name is different.
You can rename the first as in the second:
names(res) <- names(res2)

I provided an example to help you understand the logic.

library(raster)
beginCluster(2)
r <- raster()
values(r) <- 1
# simple sequential stack from 1 to 6 in all cells
s <- stack(r, r*2, r*3, r*4, r*5, r*6)
s
res <- clusterR(s, stackApply, args = list(indices=c(2,2,3,3,1,1), fun =
mean))
res
res2 <- stackApply(s, c(2,2,3,3,1,1), mean)
res2
dif <- res - res2
# exatly the same order because the difference is zero for all layers
dif
# rename
names(res) <- names(res2)

Best regards,

Frederico Faleiro

On Tue, Nov 19, 2019 at 4:15 PM Leonidas Liakos via R-sig-Geo <
r-sig-geo@r-project.org> wrote:

> I run the example with clusterR:
>
> no_cores <- parallel::detectCores() -1
> raster::beginCluster(no_cores)
> ?? res <- raster::clusterR(inp, raster::stackApply, args =
> list(indices=c(2,2,3,3,1,1),fun = mean))
> raster::endCluster()
>
> And the result is:
>
> > res
> class?? : RasterBrick
> dimensions : 180, 360, 64800, 3?? (nrow, ncol, ncell, nlayers)
> resolution : 1, 1?? (x, y)
> extent : -180, 180, -90, 90?? (xmin, xmax, ymin, ymax)
> crs?? : +proj=longlat +datum=WGS84 +ellps=WGS84 +towgs84=0,0,0
> source : memory
> names?? : layer.1, layer.2, layer.3
> min values : 1.5, 3.5, 5.5
> max values : 1.5, 3.5, 5.5??
>
>
> layer.1, layer.2, layer.3 (?)
>
> So what corrensponds to what?
>
>
> If I run:
>
> res2 <- stackApply(inp,c(2,2,3,3,1,1),mean)
>
> The result is:
>
> > res2
> class  : RasterBrick
> dimensions : 180, 360, 64800, 3  (nrow, ncol, ncell, nlayers)
> resolution : 1, 1  (x, y)
> extent : -180, 180, -90, 90  (xmin, xmax, ymin, ymax)
> crs: +proj=longlat +datum=WGS84 +ellps=WGS84 +towgs84=0,0,0
> source : memory
> names  : index_2, index_3, index_1
> min values : 1.5, 3.5, 5.5
> max values : 1.5, 3.5, 5.5
>
> There is no consistency with the names of the output and obscure
> correspondence with the indices in the case of clusterR
>
>
> [[alternative HTML version deleted]]
>
> ___
> R-sig-Geo mailing list
> R-sig-Geo@r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Warning in fit.variogram: value out of range in 'bessel k' | automap and gstat packages

2019-09-16 Thread Frederico Faleiro
Hi Jon and others,

thank you. Apparently the gstat have the capacity to fit the parameters
only specifing the variogram model (
https://www.r-spatial.org/r/2016/02/14/gstat-variogram-fitting.html), but I
imagine there is a limitation to search for the correct parameters.
Then it is better when we provide some initial values for help in this
process as the autofitVariogram perform.

Best regards,

Frederico

On Mon, Sep 16, 2019 at 6:11 AM  wrote:

> Hi Frederico,
>
>
> gstat does not have different behaviour, because autofitVariogram is using
> fit.variogram, after some preprocessing. You get the convergency problems
> with fit.variogram because you don't supply start values for the variogram,
> which is done in the preprocessing of autofitVariogram.
>
>
> Cheers,
>
> Jon
>
>
> --
> Jon Olav Skøien
> European Commission
> Joint Research Centre – JRC.E.1
> Disaster Risk Management Unit
> Building 26b 1/144 | Via E.Fermi 2749, I-21027 Ispra (VA) Italy, TP 267
> jon.sko...@ec.europa.eu 
> <https://remi.webmail.ec.europa.eu/owa/redir.aspx?C=O12RUARdbvGA3WF3zGoSV0j5xMoZlQcIEwiS4Y9G8jzXRqCCC1HUCA..=mailto%3ajon.skoien%40jrc.ec.europa.eu>
>  Tel:  +39 0332 789205 Disclaimer: Views expressed in this email are those of 
> the individual and do not necessarily represent official views of the 
> European Commission.
>
>
> --
> *From:* Frederico Faleiro 
> *Sent:* 13 September 2019 23:16:22
> *To:* SKOIEN Jon (JRC-ISPRA)
> *Cc:* RsigGeo
> *Subject:* Re: [R-sig-Geo] Warning in fit.variogram: value out of range
> in 'bessel k' | automap and gstat packages
>
> Hi Jon and others,
>
> thank you for your help. The problem is really in the Ste model (Matern,
> M. Stein's parameterization). The warning only happens
> when I execute this model. However in the gstat with the same parameters
> for lambda have problem of convergency. Do you think the gstat have
> different bevavior to the same issue?
>
> av <- autofitVariogram(jan~1, pr, model = "Ste", verbose = T)
> # In fit.variogram(experimental_variogram, model = vgm(psill = psill,
>  ... :  value out of range in 'bessel_k'
>
> # define the same kappa values of automap
> seq.k <- c(0.05, seq(0.2, 2, 0.1), 5, 10)
> ve <- variogram(jan~1, pr)
> v <- fit.variogram(v, vgm("Ste"), fit.kappa = seq.k)
> Warning messages:
> 1: In fit.variogram(object, model, fit.sills = fit.sills, fit.ranges =
> fit.ranges,  :
>   No convergence after 200 iterations: try different initial values?
> 2: In fit.variogram(object, model, fit.sills = fit.sills, fit.ranges =
> fit.ranges,  :
>   No convergence after 200 iterations: try different initial values?
> 3: In fit.variogram(o, m, fit.kappa = FALSE, fit.method = fit.method,  :
>   No convergence after 200 iterations: try different initial values?
>
>
> Best regards,
>
> Frederico
>
> On Wed, Sep 11, 2019 at 7:02 AM  wrote:
>
>>
>> Frederico,
>>
>>
>> I don't think you need to worry about this warning in this case.
>> autofitVariogram tests a lot of variogram models (and different
>> kappa-values) in the search for the best one, and then fit.variogram tries
>> to optimize the parameters based on these. It seems the warnings were
>> generated in the C-code of gstat, based on some of the tested kappa-values
>> for the Matern variogram (Stein implementation). It also seems that these
>> kappa values did not give good fits to the sample variogram, so you did not
>> get the warning from the best fit variogram in this case.
>>
>>
>> Hope this helps,
>>
>> Jon
>>
>>
>>
>> --
>> Jon Olav Skøien
>> European Commission
>> Joint Research Centre – JRC.E.1
>> Disaster Risk Management Unit
>> Building 26b 1/144 | Via E.Fermi 2749, I-21027 Ispra (VA) Italy, TP 267
>> jon.sko...@ec.europa.eu 
>> <https://remi.webmail.ec.europa.eu/owa/redir.aspx?C=O12RUARdbvGA3WF3zGoSV0j5xMoZlQcIEwiS4Y9G8jzXRqCCC1HUCA..=mailto%3ajon.skoien%40jrc.ec.europa.eu>
>>  Tel:  +39 0332 789205 Disclaimer: Views expressed in this email are those 
>> of the individual and do not necessarily represent official views of the 
>> European Commission.
>>
>>
>> --
>> *From:* R-sig-Geo  on behalf of
>> Frederico Faleiro 
>> *Sent:* 09 September 2019 21:28:32
>> *To:* RsigGeo
>> *Subject:* [R-sig-Geo] Warning in fit.variogram: value out of range in
>> 'bessel k' | automap and gstat packages
>>
>> Dear list,
>>
>> I am trying make an interpolation with the function autoKrige from the
>> package automap, but I received more

Re: [R-sig-Geo] Warning in fit.variogram: value out of range in 'bessel k' | automap and gstat packages

2019-09-13 Thread Frederico Faleiro
Hi Jon and others,

thank you for your help. The problem is really in the Ste model (Matern, M.
Stein's parameterization). The warning only happens
when I execute this model. However in the gstat with the same parameters
for lambda have problem of convergency. Do you think the gstat have
different bevavior to the same issue?

av <- autofitVariogram(jan~1, pr, model = "Ste", verbose = T)
# In fit.variogram(experimental_variogram, model = vgm(psill = psill,  ...
:  value out of range in 'bessel_k'

# define the same kappa values of automap
seq.k <- c(0.05, seq(0.2, 2, 0.1), 5, 10)
ve <- variogram(jan~1, pr)
v <- fit.variogram(v, vgm("Ste"), fit.kappa = seq.k)
Warning messages:
1: In fit.variogram(object, model, fit.sills = fit.sills, fit.ranges =
fit.ranges,  :
  No convergence after 200 iterations: try different initial values?
2: In fit.variogram(object, model, fit.sills = fit.sills, fit.ranges =
fit.ranges,  :
  No convergence after 200 iterations: try different initial values?
3: In fit.variogram(o, m, fit.kappa = FALSE, fit.method = fit.method,  :
  No convergence after 200 iterations: try different initial values?


Best regards,

Frederico

On Wed, Sep 11, 2019 at 7:02 AM  wrote:

>
> Frederico,
>
>
> I don't think you need to worry about this warning in this case.
> autofitVariogram tests a lot of variogram models (and different
> kappa-values) in the search for the best one, and then fit.variogram tries
> to optimize the parameters based on these. It seems the warnings were
> generated in the C-code of gstat, based on some of the tested kappa-values
> for the Matern variogram (Stein implementation). It also seems that these
> kappa values did not give good fits to the sample variogram, so you did not
> get the warning from the best fit variogram in this case.
>
>
> Hope this helps,
>
> Jon
>
>
>
> --
> Jon Olav Skøien
> European Commission
> Joint Research Centre – JRC.E.1
> Disaster Risk Management Unit
> Building 26b 1/144 | Via E.Fermi 2749, I-21027 Ispra (VA) Italy, TP 267
> jon.sko...@ec.europa.eu 
> <https://remi.webmail.ec.europa.eu/owa/redir.aspx?C=O12RUARdbvGA3WF3zGoSV0j5xMoZlQcIEwiS4Y9G8jzXRqCCC1HUCA..=mailto%3ajon.skoien%40jrc.ec.europa.eu>
>  Tel:  +39 0332 789205 Disclaimer: Views expressed in this email are those of 
> the individual and do not necessarily represent official views of the 
> European Commission.
>
>
> --
> *From:* R-sig-Geo  on behalf of
> Frederico Faleiro 
> *Sent:* 09 September 2019 21:28:32
> *To:* RsigGeo
> *Subject:* [R-sig-Geo] Warning in fit.variogram: value out of range in
> 'bessel k' | automap and gstat packages
>
> Dear list,
>
> I am trying make an interpolation with the function autoKrige from the
> package automap, but I received more than 50 times the following warning
> message:
>
> In fit.variogram(experimental variogram, model = vgm(psill = psill, ... :
> value out of range in 'bessel k'
>
> I have tryed identify it in the code of fit.variogram and vgm from the
> gstat package, but they do not have any warning about it in the code. Then
> I imagine the warning is from another function.
> I fitted the variogram automatically in the automap and after I fitted in
> the gstat to check if there is the same warning for that parameters, but in
> the last I do not have any warning. When I check the variogram model, it
> has a good fit apparently. I provided the reproductible example below.
>
> Do you know the consequences of this warning and how can I avoid it? I
> would like to use the automap package because I need interpolate many other
> files like this.
>
> # example
> library(automap)
> library(sp)
>
> # download the data in:
>
> https://urldefense.proofpoint.com/v2/url?u=https-3A__drive.google.com_file_d_1L33-5FQgpNMdwvWff-5FuMrwhl-5FKCajrtlVe_view-3Fusp-3Dsharing=DwICAg=8NwulVB6ucrjuSGiwL_ckQ=DA6jV5X00Z1YpyMh4OS79xeSQGErBqY83CBz841DNnU=YlpT3yKlbNVLvXMp08O8X1LkniHGXTzJeLDHLTotwOI=U2z98vHb1mpKhKJ9UXA0RUrPi6r6IXibq1cuuyB0jWY=
> # read the data
> pr <- read.table('pr_monC.txt', header = TRUE, sep = "\t")
> coordinates(pr) <- ~long+lat
> # read the grid
> gr <- read.table('grid.txt' , header = TRUE, sep = '\t')
> gridded(gr) <-  ~long+lat
>
> # automatic fit variogram with automap
> av <- autofitVariogram(jan~1, pr)
> # warnings message:   In fit.variogram(experimental variogram, model =
> vgm(psill = psill, ... : value out of range in 'bessel k'
> plot(av) # get in the graph the parameters to fit the variogram
>
> # fit variogram in gstat with the model parameters from autofitVariogram to
> check if there is any warning message too
> ve <- variogram(jan~1, pr)
> v <- fit.variogram(ve, vgm(psill = 9723, m

[R-sig-Geo] Warning in fit.variogram: value out of range in 'bessel k' | automap and gstat packages

2019-09-09 Thread Frederico Faleiro
Dear list,

I am trying make an interpolation with the function autoKrige from the
package automap, but I received more than 50 times the following warning
message:

In fit.variogram(experimental variogram, model = vgm(psill = psill, ... :
value out of range in 'bessel k'

I have tryed identify it in the code of fit.variogram and vgm from the
gstat package, but they do not have any warning about it in the code. Then
I imagine the warning is from another function.
I fitted the variogram automatically in the automap and after I fitted in
the gstat to check if there is the same warning for that parameters, but in
the last I do not have any warning. When I check the variogram model, it
has a good fit apparently. I provided the reproductible example below.

Do you know the consequences of this warning and how can I avoid it? I
would like to use the automap package because I need interpolate many other
files like this.

# example
library(automap)
library(sp)

# download the data in:
https://drive.google.com/file/d/1L33_QgpNMdwvWff_uMrwhl_KCajrtlVe/view?usp=sharing
# read the data
pr <- read.table('pr_monC.txt', header = TRUE, sep = "\t")
coordinates(pr) <- ~long+lat
# read the grid
gr <- read.table('grid.txt' , header = TRUE, sep = '\t')
gridded(gr) <-  ~long+lat

# automatic fit variogram with automap
av <- autofitVariogram(jan~1, pr)
# warnings message:   In fit.variogram(experimental variogram, model =
vgm(psill = psill, ... : value out of range in 'bessel k'
plot(av) # get in the graph the parameters to fit the variogram

# fit variogram in gstat with the model parameters from autofitVariogram to
check if there is any warning message too
ve <- variogram(jan~1, pr)
v <- fit.variogram(ve, vgm(psill = 9723, model = "Ste", range = 20, nugget
= 194, kappa = 0.8))
# it does not produce any warning
plot(ve, v)

Best regards,

Frederico

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Error in netCDF file: cells are not equally spaced

2019-08-22 Thread Frederico Faleiro
I agree with you Roy. The problem is that the ncdf4 package is focused in
read and write netCDF files, but apparently it is difficult "communicate"
with the other packages.

Could you share an example code of your approach described in the last
paragraph?

Best regards,

Frederico

On Thu, Aug 22, 2019 at 6:52 PM Roy Mendelssohn - NOAA Federal <
roy.mendelss...@noaa.gov> wrote:

> Hi  Frederico:
>
> There are two separate issues here from my point of view.  The first is
> are both files CF compliant,  and as far as I can tell they both are, and
> applications like Panoply or ncdf4 that understand CF compliant files read
> them just fine.
>
> The second issue is why can't raster or stars or gdal read them.  That is
> internal to those codes,  and people who know those packages better can
> probably answer that. And while I may appear to be beating a dead horse,
> for better or worse CF compliant netcdf-3 or netcdf-4 files (and this
> includes files that follow the Discrete Sampling Geometry guidelines) are
> what all of the major satellite data centers,  the major climate and
> oceanographic and meteorological data centers, and most modeling efforts
> (such as ROMS and IPCC) store their output.   These are standards in these
> communities,  and it would really benefit the R community if packages aimed
> at these types of data make certain they can read compliant datasets (of
> course this is easy for me to say because I won't be doing the work!).
> Also the Python programs I have also read these files just fine, but that
> is probably because the Python codes have been developed more internal to
> the particular community around netcdf files and CF conventions.
>
> I have found I get better results if I first read the file using ncdf4,
> and then if I want to use the features of raster (or some other related
> program), I use one of the raster commands  to convert arrays to raster
> objects.
>
> HTH,
>
> -Roy
>
>
>
> > On Aug 22, 2019, at 2:32 PM, Frederico Faleiro 
> wrote:
> >
> > Dear list,
> >
> > I do not understand why only this model have this issue. I can open many
> > other files from other models without any issue in raster package.
> >
> > *Roy*, I test with different model (i.e GFDL-CM3) from NOAA (
> >
> https://drive.google.com/file/d/1GrfvbRSH_FpDmlRrqNGXqn6Tz13fjhB6/view?usp=sharing
> )
> > without any problem as you can check in the code below. I find the same
> > issue pointed by *Edzer* in the GFDL-ESM2G, but not in the GFDL-CM3.
> >
> > Maybe the others can test with this file to figure out what is the
> problem
> > with first one.
> >
> > If the problem is only the difference in the interval of the cells, Is
> > there any problem to use the raster in this case?
> > The ncdf4 package can read it better, but I can not do other simple
> > analysis like cell average, for example. Is it possible read it and make
> > some kind of transformation to better use it in raster package?
> >
> > # example
> > library(ncdf4)
> > library(raster)
> > # open with ncdf4
> > x.n <- nc_open('pr_Amon_GFDL-ESM2G_rcp45_r1i1p1_204101-204512.nc')
> > z.n <- nc_open('pr_Amon_GFDL-CM3_rcp45_r1i1p1_204101-204512.nc')
> > # open with raster
> > x.r <- brick('pr_Amon_GFDL-ESM2G_rcp45_r1i1p1_204101-204512.nc',
> > stopIfNotEqualSpaced = F)
> > z.r <- brick('pr_Amon_GFDL-CM3_rcp45_r1i1p1_204101-204512.nc')
> >
> > # get coordenates
> > lat.x <- ncvar_get(x.n,"lat")
> > lon.x <- ncvar_get(x.n,"lon")
> > coords.xn <- as.matrix(expand.grid(lon.x,lat.x))
> > coords.xr <- coordinates(x.r) # different order of the xy in netcdf4
> > lat.z <- ncvar_get(z.n,"lat")
> > lon.z <- ncvar_get(z.n,"lon")
> > coords.zn <- as.matrix(expand.grid(lon.z,lat.z))
> > coords.zr <- coordinates(z.r) # different order of the zz in netcdf4
> >
> > # check the differences between the coordenates
> > diffLatLon <- function(x) {
> >  n <- length(x)-1
> >  v <- rep(NA, n)
> >  for(i in 1:n) {
> >v[i] <- x[i] - x[i+1]
> >  }
> > return(v)
> > }
> >
> > diffLatLon(lat.x)  # in this file only the first and last intervals
> differ
> > from the rest.
> > diffLatLon(lon.x) # the same interval in all longitudes
> > diffLatLon(lat.z)  # the same interval in all latitudes
> > diffLatLon(lon.z) # the same interval in all longitudes
> >
> > # plot the average of the layers in raster
> > plot(mean(x.r))
> > plot(mean(z.r))
> >
> > Best regards,
> >
> > Freder

Re: [R-sig-Geo] Error in netCDF file: cells are not equally spaced

2019-08-22 Thread Frederico Faleiro
Dear list,

I do not understand why only this model have this issue. I can open many
other files from other models without any issue in raster package.

*Roy*, I test with different model (i.e GFDL-CM3) from NOAA (
https://drive.google.com/file/d/1GrfvbRSH_FpDmlRrqNGXqn6Tz13fjhB6/view?usp=sharing)
without any problem as you can check in the code below. I find the same
issue pointed by *Edzer* in the GFDL-ESM2G, but not in the GFDL-CM3.

Maybe the others can test with this file to figure out what is the problem
with first one.

If the problem is only the difference in the interval of the cells, Is
there any problem to use the raster in this case?
The ncdf4 package can read it better, but I can not do other simple
analysis like cell average, for example. Is it possible read it and make
some kind of transformation to better use it in raster package?

# example
library(ncdf4)
library(raster)
# open with ncdf4
x.n <- nc_open('pr_Amon_GFDL-ESM2G_rcp45_r1i1p1_204101-204512.nc')
z.n <- nc_open('pr_Amon_GFDL-CM3_rcp45_r1i1p1_204101-204512.nc')
# open with raster
x.r <- brick('pr_Amon_GFDL-ESM2G_rcp45_r1i1p1_204101-204512.nc',
stopIfNotEqualSpaced = F)
z.r <- brick('pr_Amon_GFDL-CM3_rcp45_r1i1p1_204101-204512.nc')

# get coordenates
lat.x <- ncvar_get(x.n,"lat")
lon.x <- ncvar_get(x.n,"lon")
coords.xn <- as.matrix(expand.grid(lon.x,lat.x))
coords.xr <- coordinates(x.r) # different order of the xy in netcdf4
lat.z <- ncvar_get(z.n,"lat")
lon.z <- ncvar_get(z.n,"lon")
coords.zn <- as.matrix(expand.grid(lon.z,lat.z))
coords.zr <- coordinates(z.r) # different order of the zz in netcdf4

# check the differences between the coordenates
diffLatLon <- function(x) {
  n <- length(x)-1
  v <- rep(NA, n)
  for(i in 1:n) {
v[i] <- x[i] - x[i+1]
  }
return(v)
}

diffLatLon(lat.x)  # in this file only the first and last intervals differ
from the rest.
diffLatLon(lon.x) # the same interval in all longitudes
diffLatLon(lat.z)  # the same interval in all latitudes
diffLatLon(lon.z) # the same interval in all longitudes

# plot the average of the layers in raster
plot(mean(x.r))
plot(mean(z.r))

Best regards,

Frederico

On Thu, Aug 22, 2019 at 1:29 PM Edzer Pebesma 
wrote:

> Thanks!
>
> stars::read_stars (GDAL) won't read coordinates or coordinate reference
> system from this file, confirmed by running gdalinfo on it.
>
> stars::read_ncdf (installing stars from github) reads it like this:
>
> library(stars)
> # Loading required package: abind
> # Loading required package: sf
> # Linking to GEOS 3.7.1, GDAL 2.4.2, PROJ 5.2.0
> (r0 = read_ncdf("pr_Amon_GFDL-ESM2G_rcp45_r1i1p1_204101-204512.nc"))
> # no 'var' specified, using pr
> # other available variables:
> #  average_DT, average_T1, average_T2, lat, lon, bnds, time, time_bnds,
> lat_bnds, lon_bnds
> # No projection information found in nc file.
> #  Coordinate variable units found to be degrees,
> #  assuming WGS84 Lat/Lon.
> # stars object with 3 dimensions and 1 attribute
> # attribute(s):
> #  pr [kg/m^2/s]
> #  Min.   :0.000e+00
> #  1st Qu.:6.035e-06
> #  Median :1.700e-05
> #  Mean   :2.799e-05
> #  3rd Qu.:3.575e-05
> #  Max.   :1.068e-03
> # dimension(s):
> #  from  to offset delta   refsys point
> # lon 1 144 NANA +proj=longlat +datum=WGS8...NA
> # lat 1  90 NANA +proj=longlat +datum=WGS8... FALSE
> # time1  60 NANAPCICt FALSE
> #   values
> # lon  [0,2.5),...,[357.5,360) [x]
> # lat[-90,-88.98876),...,[88.98876,90) [y]
> # time [2041-01-01,2041-02-01),...,[2045-12-01,2046-01-01)
> st_get_dimension_values(r0, "lat") %>% diff %>% table
> # .
> # 1.51685393258427 2.02247191011234 2.02247191011236 2.02247191011237
> #21   74   12
>
> where essentially only the first and last intervals differ from the rest.
>
> So, yes, latitudes are not regular. Also, time is PCICt, and uses a 365
> day calendar without leap years:
>
> st_get_dimension_values(r0, "time") %>% diff %>% table
> # .
> # 28 30 31
> #  5 20 34
>
>
>
>
>
> On 8/22/19 8:55 PM, Frederico Faleiro wrote:
> > Dear list,
> > sorry about the file. The files from the CMIP5 are public, but need a
> > registration first.
> > You can access the same file in any of these links:
> > https://www.sendspace.com/file/famgz3 and
> >
> https://drive.google.com/file/d/1L1T03iQ6LU1yYRYeRypxwMB3E7fLjkJI/view?usp=sharing
> > .
> >
> > In this meantime I will try the Mike' advices.
> >
> > Best regards,
> >
> > 

[R-sig-Geo] Error in netCDF file: cells are not equally spaced

2019-08-21 Thread Frederico Faleiro
Dear list,

I am using some  netCDF files from the CMIP5 climate models in raster
package, but I am having some issues with one model.
The netCDF file from the GFDL-ESM2G model (e.g.
http://aims3.llnl.gov/thredds/fileServer/css03_data/cmip5/output1/NOAA-GFDL/GFDL-ESM2G/rcp45/mon/atmos/Amon/r1i1p1/v20120412/pr/pr_Amon_GFDL-ESM2G_rcp45_r1i1p1_204101-204512.nc
)
have the error message as in bellow example.

#Example
library(raster)
s1 <- stack("pr_Amon_GFDL-ESM2G_rcp45_r1i1p1_204101-204512.nc")
Error in .rasterObjectFromCDF(x, type = objecttype, band = band, ...) :
  cells are not equally spaced; you should extract values as points

# I check some solutions that recomend force read the file with the
argument "stopIfNotEqualSpaced = F" as bellow.
sf <- stack("pr_Amon_GFDL-ESM2G_rcp45_r1i1p1_204101-204512.nc",
*stopIfNotEqualSpaced
= F*)
bf <- brick("pr_Amon_GFDL-ESM2G_rcp45_r1i1p1_204101-204512.nc",
*stopIfNotEqualSpaced
= F*)

I performed some tests and only "works" with brick. However I did not find
any solution to check where is the problem and fix it in the file.

How can I check if it is really an issue and fix it?

sessionInfo()
R version 3.5.1 (2018-07-02)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows 10 x64 (build 18362)

Matrix products: default

locale:
[1] LC_COLLATE=Portuguese_Brazil.1252  LC_CTYPE=Portuguese_Brazil.1252
[3] LC_MONETARY=Portuguese_Brazil.1252 LC_NUMERIC=C
[5] LC_TIME=Portuguese_Brazil.1252

attached base packages:
[1] stats graphics  grDevices utils datasets  methods   base

other attached packages:
[1] raster_2.8-19 sp_1.3-1

loaded via a namespace (and not attached):
[1] compiler_3.5.1   rgdal_1.4-3  Rcpp_1.0.1   codetools_0.2-15
ncdf4_1.16.1
[6] grid_3.5.1   lattice_0.20-35

Best regards,

Frederico

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo