Re: [gdal-dev] Call for discussion on RFC 59 (v2): GDAL/OGR utilities as a library

2015-09-01 Thread Even Rouault
Folks,

I wish we could find a solution that satisfies all parties. Here's another 
iteration of a possible proposal.

* C API:

Principles :
- dataset objects can be passed
- the option structure is opaque (should address Frank's concern about exposing 
too much internal stuff)
- the option structure is parsed from an array of strings (with the current 
syntax of utilities)
- a few setters can be added, for example to set a progress function, or 
possibly auxiliary objects, like the cutline layer for gdalwarp.
- not sure if we need the *pbUsageError flag. I removed it. If there's a 
conflict of options only found after parsing (e.g. due to characteristics of 
the datasets), then a CPLError() message should be enough.

Example with GDALTranslate:

char** papszArgv = CSLParseCommandLine(const char* pszCommandLine);
GDALTranslateOptions* psOptions = GDALTranslateOptionsNew(char** papszArgv);
GDALTranslateOptionsSetProgress(GDALTranslateOptions* psOptions, GDALProgressFunc pfnProgress, void* pProgressData);
hOutDS = GDALTranslate(const char* pszDest, GDALDatasetH hSrcDataset, GDALTranslateOptions* psOptions);
GDALTranslateOptionsDestroy(GDALTranslateOptions* psOptions);
CSLDestroy(papszArgv);


* Python API

Principles :
- use the above C API from SWIG, but mostly for internal use of the upper 
levels explained below that
  would be only Python code (no SWIG).
- public API offers access to (hopefully nicely named) dedicated arguments and 
builds the string from them
- public API offers access to the string representation as well
- use Python dynamic typing to offer syntactic sugar, e.g. pass an SRS either as 
an osr.SpatialReference() object or a string (the object being serialized to a 
string, but this is a lossless operation); see the sketch below
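
A minimal sketch of such a helper (hypothetical name, not part of the proposed 
API), just to illustrate that last point:

from osgeo import osr

def _srs_to_string(srs):
    # Hypothetical helper: accept either an osr.SpatialReference or a plain
    # string and always hand a string (here WKT) to the option builder.
    if isinstance(srs, osr.SpatialReference):
        return srs.ExportToWkt()   # lossless serialization of the object
    return srs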


Examples:

1) Use case with repeatable options
options = gdal.TranslateOptions()
options.bands = [ 1, 2, 3 ]
options.format = 'MEM'
options.progress = my_progress_method
mem_ds = gdal.Translate('', src_ds, options = options)

2) Variant of 1). With some Python magic on **kwargs it can be automated to 
redirect to 1) (a sketch follows after the examples)
mem_ds = gdal.Translate('', src_ds, bands = [1,2,3], format = 'MEM', progress = 
my_progress_method)

3) String oriented.
options = gdal.TranslateOptions('-b 1 -b 2 -b 3 -of MEM')
mem_ds = gdal.Translate('', src_ds, options = options, progress = 
my_progress_method)

4) Variant of 3)
mem_ds = gdal.Translate('', src_ds, options = '-b 1 -b 2 -b 3 -of MEM', 
progress = my_progress_method)

5) For the nostalgics, a wrapper of the above :
gdal.Translate('in.tif out.tif -b 1 -b 2 -b 3 -of MEM', progress = 
my_progress_method)

That's maybe too many different possibilities, although some build upon others, 
so not necessarily a lot of code involved (easier to say while it is not yet 
coded, ahah!)
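
The **kwargs redirection mentioned in 2) could look roughly like this (a sketch 
with hypothetical names, pure Python, no SWIG):

class TranslateOptions(object):
    # Placeholder option holder for this sketch; the real class would map
    # attributes such as bands/format/progress onto '-b'/'-of' and the
    # progress callback when building the final argument string.
    pass

def _options_from_kwargs(options=None, **kwargs):
    # Sketch: when no explicit options object is given, build one from the
    # keyword arguments, so that form 2) simply redirects to form 1).
    if options is None:
        options = TranslateOptions()
        for key, value in kwargs.items():
            setattr(options, key, value)  # e.g. bands=[1, 2, 3], format='MEM'
    return options

# gdal.Translate('', src_ds, bands=[1, 2, 3], format='MEM') would then
# internally do options = _options_from_kwargs(**kwargs) and proceed as in 1).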

* Other binding languages.

In a first step, these would probably just fall back on wrapping the C API with 
SWIG, and then finding the most appropriate solution for each language.

- Java: for sure doesn't have a keyword approach for method arguments
(at least from what I remember of Java 1.6; might have changed). A builder approach
would be a possibility
(http://stackoverflow.com/questions/1988016/named-parameter-idiom-in-java)
- C#: being a clone of Java, probably not.
- Perl: apparently possible


Opinions ?

Even


> Hi Frank,
> 
> I was one of the original people who argued against the "array of strings"
> approach...
> 
> On 27 August 2015 at 02:26, Frank Warmerdam  wrote:
> > I clearly should have been commenting sooner.
> 
> Several months ago :p
> 
> > I am concerned that having messy structures of options for each
> > program is going to complicate maintaining the actual commandline
> > programs, and that it will still be a fragile and complicated point of
> > entry as commandline arguments evolve over time.
> 
> The commandline tools eventually become string-parsing and wrapping of the
> corresponding library tool - I'm not sure that makes it fragile &
> complicated? Means that the library-ified apps are *at least as*
> flexible/expressive/powerful
> as the command-line tools.
> 
> > I'd prefer if the approach had just been to embed the main()'s in a
> > library and to still pass the exact same vector of arguments (in the
> > char **argv format) to these functions instead of shelling out a
> > program.
> 
> That kinda defeats the whole point - a huge array of complex string-ified
> arguments is what we're all doing at the moment, wrapped in a subprocess
> call. Some options take multiple arguments in multiple strings, others take
> multiple arguments in single strings, it's massively confusing. And we all
> have big pipelines of chained gdalwarp/gdal_translate/etc code...
> 
> What we were striving for was to make it distinctly *better*:
> 
>- progress/logging/error handling
>- options that accept geometries or SRS or in-memory datasets without
>having to re-serialize them and/or utilise tempfiles
>- easily applying the same operations over multiple datasets
>- configuration option defaults
> 
> > I would love to b

[gdal-dev] SRS of the Elastic Search geometries?

2015-09-01 Thread Jukka Rahkonen
Hi,

Are the geometries which are stored into Elastic Search supposed to be in
EPSG:4326? I can see that the driver is writing EPSG:4326 definitions
into the metadata of the ES index, but I can still save, for example, EPSG:3067
geometries without transforming them into EPSG:4326.

-Jukka Rahkonen-

___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev


Re: [gdal-dev] SRS of the Elastic Search geometries?

2015-09-01 Thread Even Rouault
Hi Jukka,

> Are the geometries which are stored into Elastic Search supposed to be in
> EPSG:4326?

That's my understanding of the ES doc. For the geo_point mapping, you can put 
arbitrary coordinates if I remember correctly, but I believe the behaviour is 
undefined. So basically the driver reprojects on-the-fly to EPSG:4326 on feature 
insertion.

> Now I can see that the driver is writing EPSG:4326 definitions
> into the metadata of the ES index

What do you mean by "into the metadata of the ES index"? The SRS is always 
implied. Or perhaps you meant on-the-fly reprojection?

> but I can still save for example
> EPSG:3067 geometries without transforming them into EPSG:4326.

On-the-fly reprojection should occur normally (provided that your source layer 
has an explicit SRS).

Little demo:

$ cat simple.csv
id,WKT
1,"LINESTRING(2 49,3 50)"

$ ogr2ogr simple_EPSG_32631.shp simple.csv -t_srs EPSG:32631 -s_srs EPSG:4326

$ ogr2ogr -update es: simple_EPSG_32631.shp

$ ogrinfo es: simple_epsg_32631
INFO: Open of `es:'
  using driver `ElasticSearch' successful.

Layer name: simple_epsg_32631
Geometry: Line String
Feature Count: 1
Extent: (2.00, 49.00) - (3.00, 50.00)
Layer SRS WKT:
GEOGCS["WGS 84",
DATUM["WGS_1984",
SPHEROID["WGS 84",6378137,298.257223563,
AUTHORITY["EPSG","7030"]],
AUTHORITY["EPSG","6326"]],
PRIMEM["Greenwich",0,
AUTHORITY["EPSG","8901"]],
UNIT["degree",0.0174532925199433,
AUTHORITY["EPSG","9122"]],
AUTHORITY["EPSG","4326"]]
FID Column = ogc_fid
Geometry Column = geometry
_id: String (0.0)
WKT: String (0.0)
id: String (0.0)
OGRFeature(simple_epsg_32631):1
  _id (String) = AU-JSdC2eRJEoT8Pm33E
  WKT (String) = LINESTRING(2 49,3 50)
  id (String) = 1
  LINESTRING (2 49,3 50)


Even

-- 
Spatialys - Geospatial professional services
http://www.spatialys.com
___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev


[gdal-dev] hdf5 dataset. Header data too long. Truncated

2015-09-01 Thread Rashad M
Hello all,

I have the following warning from GDAL when reading a .h5 dataset

Warning 2: Header data too long. Truncated.

Looking at frmts/hdf5/hdf5dataset.cpp there is a define for
MAX_METADATA_LEN

#define MAX_METADATA_LEN 32768


and there is a check:

if( CPLStrlcat(szValue, szData, MAX_METADATA_LEN) >= MAX_METADATA_LEN )
    CPLError( CE_Warning, CPLE_OutOfMemory,
              "Header data too long. Truncated\n");

Now I could update MAX_METADATA_LEN, but is there a way it can be set at
runtime ?

I am using gdalinfo from gdal svn 1.11 branch
svn info  gives:

Path: .
Working Copy Root Path: /home/mkanavat/sources/gdal_branch_1.11
URL: http://svn.osgeo.org/gdal/branches/1.11/gdal
Relative URL: ^/branches/1.11/gdal
Repository Root: http://svn.osgeo.org/gdal
Repository UUID: f0d54148-0727-0410-94bb-9a71ac55c965
Revision: 30007
Node Kind: directory
Schedule: normal
Last Changed Author: rouault
Last Changed Rev: 30005
Last Changed Date: 2015-09-01 10:17:50 +0200 (Tue, 01 Sep 2015)



-- 
Regards,
   Rashad
___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev

Re: [gdal-dev] hdf5 dataset. Header data too long. Truncated

2015-09-01 Thread Even Rouault
Le mardi 01 septembre 2015 16:50:32, Rashad M a écrit :
> Hello all,
> 
> I have the following warning from GDAL when reading a .h5 dataset
> 
> Warning 2: Header data too long. Truncated.
> 
> Looking at frmts/hdf5/hdf5dataset.cpp there is a define for
> MAX_METADATA_LEN
> 
> #define MAX_METADATA_LEN 32768
> 
> 
> and there is a check
> if( CPLStrlcat(szValue, szData, MAX_METADATA_LEN) >= MAX_METADATA_LEN )
>     CPLError( CE_Warning, CPLE_OutOfMemory,
>               "Header data too long. Truncated\n");
> 

This was introduced in https://trac.osgeo.org/gdal/changeset/20438 to avoid 
crashes. Potentially it could be reworked to deal with arbitrarily long 
metadata, although it could be kind of annoying to have very long values as 
metadata items, especially in the main metadata domain. A cleaner solution 
would perhaps be to have something like "MY_ITEM=<beginning of value... too 
large, query the HDF5 metadata domain to get it>", and you'd have the full value 
in the HDF5 metadata domain (perhaps not unlimited, but with a much higher limit, 
let's say 1 MB per item). People with more experience with HDF5 datasets could 
perhaps comment.

> Now I could update MAX_METADATA_LEN, but is there a way it can be set at
> runtime ?

No, you need to recompile.

-- 
Spatialys - Geospatial professional services
http://www.spatialys.com
___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev

Re: [gdal-dev] Call for discussion on RFC 59 (v2): GDAL/OGR utilities as a library

2015-09-01 Thread Tim Keitt
http://www.keittlab.org/

On Tue, Sep 1, 2015 at 8:34 AM, Even Rouault wrote:

> Folks,
>
> I wish we could find a solution that satisfies all parties. Here's another
> iteration for a possible proposition.
>
> * C API:
>
> Principles :
> - dataset objects can be passed
> - the option structure is opaque (should address Frank's concern about
> exposing too much internal stuff)
> - the option structure is parsed from an array of strings (with the
> current syntax of utilities)
> - a few setters can be added, for example, to set a progress function. Or
> possibly auxiliary objects,
>   like the cutline layer for gdalwarp.
> - not sure if we need the *pbUsageError flag. I removed it. If there's a
> conflict of options only found after
>   parsing time (eg due to characteristics of the datasets), then a
> CPLError() message should be enough.
>
> Example with GDALTranslate:
>
> char** papszArgv = CSLParseCommandLine(const char* pszCommandLine);
> GDALTranslateOptionsParse* psOptions = GDALTranslateOptionsNew(char**
> papszArgv);
> GDALTranslateOptionsSetProgress(options, pfnProgress, pProgressData);
> hOutDS = GDALTranslate(const char *pszDest, GDALDatasetH hSrcDataset,
> GDALTranslateOptions *psOptions)
> GDALTranslateOptionsDestroy(GDALTranslateOptionsParse* psOptions);
> CSLDestroy(papszArgV);
>
>
> * Python API
>
> Principles :
> - use the above C API from SWIG, but mostly for internal use of the upper
> levels explained below that
>   would be only Python code (no SWIG).
> - public API offers access to (hopefully nicely named) dedicated arguments
> and builds the string from them
> - public API offers access to the string representation as well
> - use Python dynamic typing to offer sugar candy, e.g pass a SRS either as
> an osr.SpatialReference()
> object or a string (the object being serialized to string, but this is a
> lossless operation)
>
>
> Examples:
>
> 1) Use case with repeatable options
> options = gdal.TranslateOptions()
> options.bands = [ 1, 2, 3 ]
> options.format = 'MEM'
> options.progress = my_progress_method
> mem_ds = gdal.Translate('', src_ds, options = options)
>
> 2)  Variant of 1). With some Python magic on **kwargs it can be automated
> to redirect on 1)
> mem_ds = gdal.Translate('', src_ds, bands = [1,2,3], format = 'MEM',
> progress = my_progress_method)
>
> 3) String oriented.
> options = gdal.TranslateOptions('-b 1 -b 2 -b 3 -of MEM')
> mem_ds = gdal.Translate('', src_ds, options = options, progress =
> my_progress_method)
>
> 4) Variant of 3)
> mem_ds = gdal.Translate('', src_ds, options = '-b 1 -b 2 -b 3 -of MEM',
> progress = my_progress_method)
>
> 5) For the nostalgics, a wrapper of the above :
> gdal.Translate('in.tif out.tif -b 1 -b 2 -b 3 -of MEM', progress =
> my_progress_method)
>
> That's maybe too many different possibilities, although some build upon
> others, so not necessarily
> a lot of code involved (easier to say when it is not coded ahah!)
>
> * Other binding languages.
>
> Would probably only fallback on wrapping C API with SWIG in a first step.
> And find most appropriate solutions for each language.
>
> - Java for sure doesn't have a keyword approach for method arguments
> (well with what I remember from Java 1.6. might have changed). Builder
> approach
> would be a possibility (
> http://stackoverflow.com/questions/1988016/named-parameter-idiom-in-java)
> - C#: being a clone of Java, probably not.
> - Perl: apparently possible
>
>
> Opinions ?
>

In an ideal world, I would prefer a nice clean algorithms library that is
orthogonal to the command line and parsing. The utilities then simply
consist of parsing and calling this library. I would also prefer the
library to be broken down into a set of orthogonal lower-level primitives
and the higher-level algorithms built from these. But I cannot contribute
significantly to that effort, so I think it's up to those that can to decide
the course.

THK


>
> Even
>
>
> > Hi Frank,
> >
> > I was one of the original people who argued against the "array of
> strings"
> > approach...
> >
> > On 27 August 2015 at 02:26, Frank Warmerdam  wrote:
> > > I clearly should have been commenting sooner.
> >
> > Several months ago :p
> >
> > > I am concerned that having messy structures of options for each
> > > program is going to complicate maintaining the actual commandline
> > > programs, and that it will still be a fragile and complicated point of
> > > entry as commandline arguments evolve over time.
> >
> > The commandline tools eventually become string-parsing and wrapping of
> the
> > corresponding library tool - I'm not sure that makes it fragile &
> > complicated? Means that the library-ified apps are *at least as*
> > flexible/expressive/powerful
> > as the command-line tools.
> >
> > > I'd prefer if the approach had just been to embed the main()'s in a
> > > library and to still pass the exact same vector of arguments (in the
> > > char **argv format) to these functions instead of shelling out a
> > > program.
> >
> >

Re: [gdal-dev] Call for discussion on RFC 59 (v2): GDAL/OGR utilities as a library

2015-09-01 Thread Even Rouault
> 
> In an ideal world, I would prefer a nice clean algorithms library that is
> orthogonal to the command line and parsing. The utilities then simply
> consist of parsing and calling this library. I would also prefer the
> library to be broken down in to a set of orthogonal lower-level primitives
> and the higher-level algorithms built from these.

Well, if I had to classify the content of the apps/ source tree of GDAL, 
I'd say there are:

* Heavy wrappers with lots of options on top of existing APIs to accommodate 
various workflows:
- gdal_translate: wrapper of GDALCreateCopy() and the VRT API
- gdalwarp: demo of the C++ warping API
- ogr2ogr: yes, admittedly a lot of stuff that uses the OGR API. At the API 
level, there's some "competition" with the CopyLayer() API, which lags behind 
all the advanced options of ogr2ogr. I guess defining some pipeline / chaining 
of operations could make sense on paper, although there's a lot of nasty 
logic to deal with. Some switches have effects both on layer and field creation 
and on each feature.

* Thin wrappers over API (see the sketch after this list):
- gdal_contour: wrapper of GDALContourGenerate()
- gdaladdo: wrapper of GDALBuildOverviews()
- gdal_rasterize: wrapper of GDALRasterizeGeometries()

* Algorithms implemented in the utilities:
- gdalbuildvrt
- gdaldem
- nearblack
==> would really benefit from being librarified

* In the UNIX pipe philosophy:
- gdallocationinfo
- gdaltransform
==> not sure if they're worth being librarified

* Informational utilities:
- gdalinfo ==> makes sense as a library function with the JSON output
- ogrinfo ==> less obvious; or perhaps just the metadata as JSON
- gdalsrsinfo / testepsg ==> probably not worth being librarified
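
To make the thin-wrapper point concrete, gdaladdo essentially boils down to a 
call that is already reachable from the bindings (a sketch, with a hypothetical 
file name):

from osgeo import gdal

# Roughly equivalent to: gdaladdo -r average in.tif 2 4 8
ds = gdal.Open('in.tif', gdal.GA_Update)   # hypothetical input file
ds.BuildOverviews('AVERAGE', [2, 4, 8])
ds = None  # close the dataset and flush the overviews to disk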

I can understand the aim of orthogonal algorithms, etc., but I'm not 
completely clear on how that would translate into code ;-)

Even

-- 
Spatialys - Geospatial professional services
http://www.spatialys.com
___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev


Re: [gdal-dev] SRS of the Elastic Search geometries?

2015-09-01 Thread Jukka Rahkonen
Even Rouault writes:

> 
> Hi Jukka,


> > but I can still save for example
> > EPSG:3067 geometries without transforming them into EPSG:4326.
> 
> On-the-fly reprojection should occur normally (provided that your source
layer 
> has explicit SRS)


Probably that's it, the explicit SRS. I started with data that have a native
SRS of epsg:3067, in the OpenJUMP JML format, which has no means of holding an SRS.
Conversion into ES always gives an empty layer; just the schema gets
inserted but no geometries, even if I use -s_srs and -t_srs.

Next trial was to convert JML into GML without assigning an SRS. The GML has an
undefined SRS, and conversion without -s_srs and -t_srs leads to a situation
where the metadata says that ES is using epsg:4326 but the features are in
epsg:3067. With proper -s_srs and -t_srs the result is good.

If I convert JML into GML and assign the SRS with -a_srs epsg:3067 the result is
the same as above. Each GML feature has its SRS defined with srsName="EPSG:3067",
but perhaps it is not explicit, because conversion into
ES gives the correct result only when -s_srs and -t_srs are defined.

So, two problems:
- No way at all to use Jump JML as an input format
- With GML, the parameters -s_srs and -t_srs must be defined even if each GML
feature has a srsName and all features in the GML file have the same srsName.

A warning like "No explicit SRS found, use -s_srs and -t_srs" might be good
to have.

I know very little about Elastic Search, I just started experimenting a week
ago. I have concluded that what is a layer for GDAL is an "index", and what is a
feature is a "document", and that documents are somehow saved as-is and
clever indexes are built on top of them. Therefore I was thinking that
perhaps everything is OK on the ES side even if the documents come out of
it with epsg:3067 geometries.

-Jukka-

___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev


Re: [gdal-dev] SRS of the Elastic Search geometries?

2015-09-01 Thread Even Rouault
> 
> Probably that's it, the explicit SRS. I started with data that have native
> SRS epsg:3067 and OpenJUMP JML format which has no means for holding SRS.
> Conversion into ES gives always an empty layer, just the schema gets
> inserted but no geometries, even if I use -s_srs and -t_srs.
> 
> Next trial was to convert JML into GML without assigning SRS. GML has an
> undefined SRS and conversion without -s_srs and -t_srs leads to a situation
> where metadata says that ES is using epsg:4326 but the features are in
> epsg:3067. With proper -s_srs and -t_srs the result is good.

You can also use -a_srs for that.

> 
> If I convert JML into GML and assign SRS with -a_srs epsg:3067 the result
> is the same as above. Each GML feature has its SRS defined with
> srsName="EPSG:3067", but perhaps it is not explicit because conversion
> into ES gives the correct result only when -s_srs and -t_srs are defined.
> 
> So, two problems:
> - No way at all to use Jump JML as input format

Are you sure about that ? I've just tried the following and it works:

$ ogr2ogr simple_EPSG_32631.jml simple_EPSG_32631.shp -f jml
$ ogr2ogr  -update es: simple_EPSG_32631.jml -overwrite -s_srs EPSG:32631 -
t_srs EPSG:32631
(or with -a_srs EPSG:32631)

The behaviour with JML and GML inputs should be the same.

> - With GML parameters -s_srs and -t_srs must be defined even if each GML
> feature has srsName and all features in the GML file have same srsName.

The issue with GML is complex. Basically, the GML driver is very cautious and 
doesn't report a layer SRS, since there's no way to know if all features have 
the same SRS without exploring the whole file, which it will avoid if there's a 
.xsd that it can understand. If you remove the .xsd, then a full initial scan 
will be done and, if the driver realizes that all features have the same SRS, 
this will be written in the .gfs file and thus reported as the layer SRS.
With some particular formulations of GML (like a WFS GetFeature response, where 
we know that the SRS is homogeneous), the driver will use the SRS of the top 
Envelope as the layer SRS.
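
A quick way to see what the driver actually reports as the layer SRS, before any 
-s_srs/-t_srs enters the picture (a sketch, with a hypothetical file name):

from osgeo import ogr

ds = ogr.Open('data.gml')          # hypothetical file
layer = ds.GetLayer(0)
srs = layer.GetSpatialRef()
print(srs.ExportToWkt() if srs is not None else 'no layer SRS reported')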

> 
> A warning like "No explicit SRS found, use -s_srs and -t_srs" might be good
> to have.

Should be addressed by my latest commit:
"ES: add warnings at layer creation if no SRS is specified, and at feature 
insertion, in the case where no layer SRS is defined when the bounding box 
isn't valid for long/lat"

> 
> I know very little about Elastic Search, just started experiments a week
> ago. I have concluded that what is layer for GDAL is "index", and what is
> feature, is "document", and that documents are somehow saved as-is

That really depends. You may choose to store, or not, the _all field or 
individual fields.

> and
> clever indexes are built on top of them. Therefore I was thinking that
> perhaps everything is OK on the ES side even if the documents come out from
> it with epsg:3067 geometries.

The spatial indices make assumptions about the coordinates. For example, a geohash 
really assumes the [-90,90]x[-180,180] range. Not sure how geo_shape indexing 
works.

> 
> -Jukka-
> 
> -Jukka-
> 
> ___
> gdal-dev mailing list
> gdal-dev@lists.osgeo.org
> http://lists.osgeo.org/mailman/listinfo/gdal-dev

-- 
Spatialys - Geospatial professional services
http://www.spatialys.com
___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev


[gdal-dev] gdal_polygonize.py not running

2015-09-01 Thread Giorgio Ghiggini
Hi,

 

I am using the GISInternals “release-1800-x64-gdal-1-11-1-mapserver-6-4-1”
and I run the SDKShell.bat.

Then I tried running the gdal_polygonize.py as is and I got the following
message:


-

C:\GEC\Software\GDAL\gisinternals>gdal_polygonize.py

"gdal_polygonize.py" non è riconosciuto come comando interno o esterno,

 un programma eseguibile o un file batch.


-

In Italian, that means "command not recognized", so I thought SDKShell.bat is
not setting the path for this script.

 

I then tried running the command from the script folder as follows, but got
several errors:


-

C:\GEC\Software\GDAL\gisinternals\bin\gdal\python\scripts>gdal_polygonize.py C:\GEC\Data\Slopes\testgdaldem-combined.tif -f "ESRI Shapefile" C:\GEC\Data\Slopes\testgdalpolygonize.shp

Traceback (most recent call last):
  File "C:\GEC\Software\GDAL\gisinternals\bin\gdal\python\scripts\gdal_polygonize.py", line 34, in <module>
    from osgeo import gdal, ogr, osr
  File "C:\GEC\Software\GDAL\gisinternals\bin\gdal\python\osgeo\__init__.py", line 21, in <module>
    _gdal = swig_import_helper()
  File "C:\GEC\Software\GDAL\gisinternals\bin\gdal\python\osgeo\__init__.py", line 17, in swig_import_helper
    _mod = imp.load_module('_gdal', fp, pathname, description)
  File "C:\Python34\lib\imp.py", line 243, in load_module
    return load_dynamic(name, filename, file)
ImportError: DLL load failed: Impossibile trovare il modulo specificato.


-

 

 

To me it looks like the script is not able to load the other modules
(gdal, ogr, osr…); the ImportError means "the specified module could not be
found", so I still think it is a matter of PATH setting.
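
A minimal check that isolates the bindings import from the script itself would 
be something like this (just a diagnostic sketch):

# Run from the same console where SDKShell.bat was executed; if this fails
# with the same ImportError, the problem is the PATH/PYTHONPATH environment,
# not gdal_polygonize.py itself.
from osgeo import gdal
print(gdal.VersionInfo())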

 

Any help is very much appreciated.

 

Best regards,

Giorgio

 

 

Giorgio Ghiggini

 

GEC di Ghiggini Giorgio

Via Monte Matanna 1/C

55049 Viareggio (LU)

Italy

P.Iva: 01381570116

cell.: +39 331 141 9315

email: gghigg...@gec-it.com

www.gec-it.com

  www.globalterramaps.com

www.globalaquamaps.com  

 

___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev

Re: [gdal-dev] hdf5 dataset. Header data too long. Truncated

2015-09-01 Thread Rashad M
On Tue, Sep 1, 2015 at 5:08 PM, Even Rouault wrote:

> Le mardi 01 septembre 2015 16:50:32, Rashad M a écrit :
> > Hello all,
> >
> > I have an the following warning from gdal when reading .h5 dataset
> >
> > Warning 2: Header data too long. Truncated.
> >
> > Looking at frmts/hdf5/hdf5dataset.cpp there is a define for
> > MAX_METADATA_LEN
> >
> > #define MAX_METADATA_LEN 32768
> >
> >
> > and there is a check
> > if( CPLStrlcat(szValue, szData, MAX_METADATA_LEN) >= MAX_METADATA_LEN )
> >     CPLError( CE_Warning, CPLE_OutOfMemory,
> >               "Header data too long. Truncated\n");
> >
>
> This was introduced in https://trac.osgeo.org/gdal/changeset/20438 to
> avoid crashes. Potentially it could be reworked to deal with arbitrarily long
> metadata, although it could be kind of annoying to have very long values as
> metadata items, especially in the main metadata domain. A cleaner solution
> would perhaps be to have something like "MY_ITEM=<beginning of value... too
> large, query the HDF5 metadata domain to get it>", and you'd have the full
> value in the HDF5 metadata domain (perhaps not unlimited but with a much
> higher limit, let's say 1 MB per item). People with more experience with
> HDF5 datasets could perhaps comment.
>

All of the truncation messages come when reading doubles. Here %.15g is used.
Isn't that too much? Could it be reduced, or changed to simply %g for doubles?
https://trac.osgeo.org/gdal/browser/branches/1.11/gdal/frmts/hdf5/hdf5dataset.cpp#L800
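
For scale: %g defaults to 6 significant digits while %.15g keeps 15, so each 
double takes roughly twice as many characters but preserves (nearly) full double 
precision. The same printf-style semantics apply in Python:

value = 1.0 / 3.0
print('%.15g' % value)  # 0.333333333333333  (15 significant digits)
print('%g' % value)     # 0.333333           (default: 6 significant digits)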


>
> > Now I could update MAX_METADATA_LEN, but is there a way it can be set at
> > runtime ?
>
> No, you need to recompile.
>
> --
> Spatialys - Geospatial professional services
> http://www.spatialys.com
>



-- 
Regards,
   Rashad
___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev

Re: [gdal-dev] SRS of the Elastic Search geometries?

2015-09-01 Thread Rahkonen Jukka (MML)
Even Rouault wrote:

>>
>> Probably that's it, the explicit SRS. I started with data that have native
>> SRS epsg:3067 and OpenJUMP JML format which has no means for holding SRS.
>> Conversion into ES gives always an empty layer, just the schema gets
>> inserted but no geometries, even if I use -s_srs and -t_srs.
>>
>> Next trial was to convert JML into GML without assigning SRS. GML has an
>> undefined SRS and conversion without -s_srs and -t_srs leads to a situation
>> where metadata says that ES is using epsg:4326 but the features are in
>> epsg:3067. With proper -s_srs and -t_srs the result is good.

> You can also use -a_srs for that.

I made a test and it appears to work, but isn't it confusing because it has a 
different effect than with most other drivers? The manual page 
http://www.gdal.org/ogr2ogr.html says "-a_srs srs_def: Assign an output 
SRS". With Elastic Search, if you use -a_srs epsg:3067 the output SRS will 
actually be epsg:4326. Let's keep this secret and teach the users to always use 
explicit -s_srs and -t_srs.

>>
>> If I convert JML into GML and assign SRS with -a_srs epsg:3067 the result
>> is the same as above. Each GML feature has its SRS defined with
>> srsName="EPSG:3067", but perhaps it is not explicit because conversion
>> into ES gives the correct result only when -s_srs and -t_srs are defined.
>
>> So, two problems:
>> - No way at all to use Jump JML as input format

> Are you sure about that ? I've just tried the following and it works:
I am sure that I made a wrong conclusion. The problem is related to JML but it 
is somehow special. I took the test data from a real-world dataset, and I attach 
a minimal sample JML file for reproducing the issue.

> $ ogr2ogr simple_EPSG_32631.jml simple_EPSG_32631.shp -f jml
> $ ogr2ogr -update es: simple_EPSG_32631.jml -overwrite -s_srs EPSG:32631 -t_srs EPSG:32631
> (or with -a_srs EPSG:32631)

> The behaviour with JML and GML inputs should be the same.
My test data in JML format contains this DATETIME value:
2015-06-14T22:29:15.454+0300

It leads to this ElasticSearch error, which I took from the ES console:
Caused by: org.elasticsearch.index.mapper.MapperParsingException: failed to parse
date field [2015/06/14 22:29:15.454+03], tried both date format
[/MM/dd HH:mm:ss.SSS||/MM/dd], and timestamp number with locale []

When I convert JML into GML with ogr2ogr, the DATETIME comes into GML as
2015/06/14 22:29:15.454+03
This is OK for Elastic Search. I do not see any difference, but there must be 
some. JML failed for me because of this parsing error. ogr2ogr does not catch 
the error, not even when run with --debug on.


>> - With GML parameters -s_srs and -t_srs must be defined even if each GML
>> feature has srsName and all features in the GML file have same srsName.

> The issue with GML is complex. Basically, the GML driver is very cautious and
> doesn't report a layer SRS, since there's no way to know if all features have
> the same SRS without exploring the whole file, which it will avoid if there's
> a .xsd that it can understand. If you remove the .xsd, then a full initial scan
> will be done and, if the driver realizes that all features have the same SRS,
> this will be written in the .gfs file and thus reported as the layer SRS.
> With some particular formulations of GML (like a WFS GetFeature response where
> we know that the SRS is homogeneous), the driver will use the SRS of the top
> Envelope as the layer SRS.

>
>> A warning like "No explicit SRS found, use -s_srs and -t_srs" might be good
>> to have.

> Should be addressed by my latest commit:
> "ES: add warnings at layer creation if no SRS is specified, and at feature
> insertion, in the case where no layer SRS is defined when the bounding box
> isn't valid for long/lat"

>
>> I know very little about Elastic Search, just started experiments a week
>> ago. I have concluded that what is layer for GDAL is "index", and what is
>> feature, is "document", and that documents are somehow saved as-is

> That really depends. You may choose to store or not the _all fields or
> individual fields.

>> and
>> clever indexes are built on top of them. Therefore I was thinking that
>> perhaps everything is OK on the ES side even if the documents come out from
>> it with epsg:3067 geometries.

> The spatial indices make assumptions about the coordinates. For example a geohash
> really assumes the [-90,90]x[-180,180] range. Not sure how geo_shape indexing
> works.

-Jukka-


date_error.jml
Description: date_error.jml
___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev

Re: [gdal-dev] Call for discussion on RFC 59 (v2): GDAL/OGR utilities as a library

2015-09-01 Thread Ari Jolma

On 01.09.2015 16:34, Even Rouault wrote:

2)  Variant of 1). With some Python magic on **kwargs it can be automated to 
redirect on 1)
mem_ds = gdal.Translate('', src_ds, bands = [1,2,3], format = 'MEM', progress = 
my_progress_method)




* Other binding languages.


- Perl: apparently possible


Yes it is. Perl being a non-typed, high-level language, it is possible and IMO 
preferable to be able to give the options in a key-value list (hash) 
where the keys are well-known strings and the values are understandable. 
The C API need not do this itself, but it should allow it.


Ari




Opinions ?

Even



Hi Frank,

I was one of the original people who argued against the "array of strings"
approach...

On 27 August 2015 at 02:26, Frank Warmerdam  wrote:

I clearly should have been commenting sooner.

Several months ago :p


I am concerned that having messy structures of options for each
program is going to complicate maintaining the actual commandline
programs, and that it will still be a fragile and complicated point of
entry as commandline arguments evolve over time.

The commandline tools eventually become string-parsing and wrapping of the
corresponding library tool - I'm not sure that makes it fragile &
complicated? Means that the library-ified apps are *at least as*
flexible/expressive/powerful
as the command-line tools.


I'd prefer if the approach had just been to embed the main()'s in a
library and to still pass the exact same vector of arguments (in the
char **argv format) to these functions instead of shelling out a
program.

That kinda defeats the whole point - a huge array of complex string-ified
arguments is what we're all doing at the moment, wrapped in a subprocess
call. Some options take multiple arguments in multiple strings, others take
multiple arguments in single strings, it's massively confusing. And we all
have big pipelines of chained gdalwarp/gdal_translate/etc code...

What we were striving for was to make it distinctly *better*:

- progress/logging/error handling
- options that accept geometries or SRS or in-memory datasets without
having to re-serialize them and/or utilise tempfiles
- easily applying the same operations over multiple datasets
- configuration option defaults


I would love to be able to replace many places where I shell out to
run gdal command line programs with a library call with essentially
the same arguments.

Sure, so it should be straightforward to do that *as well*, though besides
in-memory data (as you mention) you're getting very little benefit.

Rob :)


___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev


Re: [gdal-dev] SRS of the Elastic Search geometries?

2015-09-01 Thread Even Rouault
> I made a test and it appears to work, but isn't it confusing because it
> has a different effect than with most other drivers? The manual page
> http://www.gdal.org/ogr2ogr.html says "-a_srs srs_def: Assign an
> output SRS". With Elastic Search, if you use -a_srs epsg:3067 the output
> SRS will actually be epsg:4326. Let's keep this secret and teach the users
> to always use explicit -s_srs and -t_srs.

Well, it assigns the SRS before the driver enters into action. Drivers tend to 
do nasty things due to format constraints ;-) But yes, that's a particular 
case.

> 
> My test data in JML format contains this DATETIME value:
> 2015-06-14T22:29:15.454+0300
> 
> It leads to this ElasticSearch error which I took from the ES console:
> Caused by: org.elasticsearch.index.mapper.MapperParsingException: failed to
> parse date field [2015/06/14 22:29:15.454+03], tried both date format
> [/MM/dd HH:mm:ss.SSS||/MM/dd], and timestamp number with locale []

OK there was indeed an issue with the declared format not allowing timezones. 
Just fixed.

> 
> When I convert JML into GML with ogr2ogr the DATETIME comes into GML as
>  2015/06/14 22:29:15.454+03
> This is OK for Elastic Search. I do not see any difference but there must
> be some. 

Yes, the GML driver does not yet properly support DateTime fields, so they are 
just converted as strings...

> JML failed for me because of this parsing error. ogr2ogr does not
> catch the error, not even when run with --debug on.

Indeed the bulk uploading code didn't detect errors reported by the server. 
Now errors are reported back.

Thanks for your testing and reports !

-- 
Spatialys - Geospatial professional services
http://www.spatialys.com
___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev