[gdal-dev] gdal_translate segfault with large netCDF

2011-04-19 Thread Josh.Vote
Hi,

I'm new to GDAL so please forgive any glaring ignorance :)

Currently I have an 8GB ER Mapper (ERS) dataset that I want to convert to a
NetCDF file with gdal_translate, but the following command always results in a
segfault:

gdal_translate -of netCDF input.ers output.nc

Translating only a small 4GB subset of the dataset, however, works fine.

Now I've been doing a bit of reading and I know that netCDF 3.6.2 and earlier
don't support variables greater than 4GB. However, I've been doing the
translations with the Debian libnetcdf6 package (which I believe includes
netCDF 4.1.1; running 'ncgen -version' confirms this), and I am operating under
the impression that netCDF 4.1.1 should be able to handle netCDF files of this
size without trouble.
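
(For reference, a couple of quick checks on the installed library; I'm assuming
nc-config is on the path, since it ships with netCDF 4.1.x:

  ncgen -version        # prints the netCDF library version
  nc-config --has-nc4   # 'yes' if the library was built with netCDF-4/HDF5 support

One thing I'm not sure of: even with a 4.x library installed, a file created in
the classic format is still subject to the old size limits, so the format the
driver actually writes may matter as much as the library version.)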

I've tested gdal_translate from a variety of builds, and they all produce
the same problem:

* gdal-trunk revision 22206 - built from source

* gdal 1.8 - built from source

* The Debian stable/testing gdal-bin binary packages, which bundle
gdal 1.6.3-4 and gdal 1.7.3-2 respectively.

Am I doing anything wrong, or operating under incorrect assumptions about
netCDF and files larger than 4GB? Or is this a genuine problem with
gdal_translate? Also, are there any logs I could examine for more information?

Thanks for any help
Josh Vote




Re: [gdal-dev] gdal_translate segfault with large netCDF

2011-04-19 Thread Frank Warmerdam

On 11-04-19 05:01 AM, josh.v...@csiro.au wrote:

[Josh's original message quoted in full; trimmed]


Josh,

I don't see any immediately obvious problems in the way we call the netcdf
API in netcdfdataset.cpp's CreateCopy method.  I would suggest you file a
ticket on the issue, and a developer can try to reproduce the problem.
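
In the meantime, you can get more diagnostic output by enabling GDAL's debug
messages; this is a standard switch, though how much the netCDF driver logs I
can't say offhand:

  gdal_translate --debug ON -of netCDF input.ers output.nc

(Setting CPL_DEBUG=ON in the environment has the same effect.)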

I would like to suggest that you do a gdal_translate from a subset of the
ERS file at the bottom right corner of the source just to ensure that it
isn't a problem with reading past the 4GB mark in the ERS file.
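
Something along these lines, with the offsets computed from the raster size
that gdalinfo reports (XSIZE and YSIZE below are placeholders for your
raster's width and height):

  gdal_translate -of netCDF -srcwin $((XSIZE-100)) $((YSIZE-100)) 100 100 \
      input.ers corner.nc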

I seem to have 3.6.3 of the netcdf library, so I can't trivially check
this on my system.

Best regards,
--
---+--
I set the clouds in motion - turn up   | Frank Warmerdam, warmer...@pobox.com
light and sound - activate the windows | http://pobox.com/~warmerdam
and watch the world go round - Rush    | Geospatial Programmer for Rent



Re: [gdal-dev] gdal_translate segfault with large netCDF

2011-04-19 Thread Kyle Shannon
Josh,
As Frank said, file a ticket and provide the output of 'ncdump -h
yourfile.nc' with the ticket.  I will take a look at it as soon as I can,
although I am pretty busy.  Thanks.

kss

/**
 *
 * Kyle Shannon
 * ksshan...@gmail.com
 *
 */




On Tue, Apr 19, 2011 at 09:57, Frank Warmerdam warmer...@pobox.com wrote:

[Frank's reply, including Josh's original message, quoted in full; trimmed]


Re: [gdal-dev] gdal_translate segfault with large netCDF

2011-04-19 Thread Nikolaos Hatzopoulos
What kind of netCDF file is causing the problem? Is it a netCDF-4 or netCDF-3
file?

There is a compile-time configure option for the library, --enable-netcdf-4,
which turns on support for the HDF5-based netCDF-4 format.
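
(If there is a .nc file from one of the partial runs, running

  ncdump -k output.nc

should settle that; it prints the file's on-disk format: 'classic', '64-bit
offset', or 'netCDF-4'.)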


On Tue, Apr 19, 2011 at 11:35 AM, Kyle Shannon ksshan...@gmail.com wrote:

[Kyle's reply, including the earlier quotes, quoted in full; trimmed]


RE: [gdal-dev] gdal_translate segfault with large netCDF

2011-04-19 Thread Josh.Vote
Thanks for the suggestions -

 I would like to suggest that you do a gdal_translate from a subset of the
 ERS file at the bottom right corner of the source just to ensure that it
 isn't a problem with reading past the 4GB mark in the ERS file.

I just managed to run 'gdal_translate -of netCDF -srcwin 5 4 100 100
input.ers output.nc' without issue (the input ERS dataset is 50592 x 41876).
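
(For the bottom-right corner specifically, if my arithmetic is right, that
window would be:

  gdal_translate -of netCDF -srcwin 50492 41776 100 100 input.ers output.nc

i.e. an x offset of 50592-100 and a y offset of 41876-100.)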

 Josh,
 As Frank said, file a ticket and provide the output of 'ncdump -h
 yourfile.nc' with the ticket.  I will take a look at it as soon as I can,
 although I am pretty busy.  Thanks.

I may have misled people with the subject of the message; sorry about that. The
issue is translating a large ER Mapper file into an equally large netCDF file
(reading .ers and writing .nc).

http://trac.osgeo.org/gdal/ticket/4047

I've attached the ERS dataset header to the issue for reference, please let me 
know if you need more info.

 What kind of netCDF file is causing the problem? Is it a netCDF-4 or netCDF-3 file?

 There is a compile-time configure option for the library, --enable-netcdf-4.

Will this affect writing a netCDF file? Sorry again if I've misled you; the
issue is reading an ER Mapper (.ers) file and writing a netCDF file.

Thanks again everyone for your assistance.
Josh Vote