[gdal-dev] Warping with +proj=ob_tran

2014-11-26 Thread Knut-Frode Dagestad
Hi,



I have some NetCDF files tagged with this proj4 string:

proj4=+proj=ob_tran +o_proj=longlat +lon_0=-40 +o_lat_p=22 +R=6.371e+06 +no_defs



As ob_tran seems not to be supported, is there any way I can warp this
onto a regular (unrotated) lon/lat grid?



There is a long, three-year-old discussion of this in [1], but the
conclusion is not clear to me.



[1] https://trac.osgeo.org/gdal/ticket/4285
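For reference, since ob_tran with +o_proj=longlat describes a rotated lon/lat grid, the underlying mapping is a plain spherical rotation, which can be sketched independently of GDAL. The sketch below is illustrative only: the exact sign and offset conventions of PROJ's +lon_0/+o_lat_p parameters are assumptions that should be checked against PROJ itself.

```python
import numpy as np

def rotation_matrix(o_lat_p_deg, lon_0_deg):
    """Rotation taking geographic coordinates to a rotated-pole frame.

    The parameter interpretation (tilt by 90 - o_lat_p after a rotation
    about the z-axis by lon_0) is an assumption, not PROJ-verified.
    """
    theta = np.radians(90.0 - o_lat_p_deg)  # tilt of the pole
    phi = np.radians(lon_0_deg)             # rotation about the z-axis
    Rz = np.array([[np.cos(phi), np.sin(phi), 0],
                   [-np.sin(phi), np.cos(phi), 0],
                   [0, 0, 1]])
    Ry = np.array([[np.cos(theta), 0, -np.sin(theta)],
                   [0, 1, 0],
                   [np.sin(theta), 0, np.cos(theta)]])
    return Ry @ Rz

def lonlat_to_xyz(lon, lat):
    # Unit vector on the sphere for given lon/lat in degrees
    lon, lat = np.radians(lon), np.radians(lat)
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

def xyz_to_lonlat(v):
    # Back from a unit vector to lon/lat in degrees
    return (np.degrees(np.arctan2(v[1], v[0])),
            np.degrees(np.arcsin(np.clip(v[2], -1, 1))))

R = rotation_matrix(22.0, -40.0)
# Rotated -> geographic is the transpose (inverse) of the rotation
rot_lon, rot_lat = 10.0, 50.0
geo = xyz_to_lonlat(R.T @ lonlat_to_xyz(rot_lon, rot_lat))
back = xyz_to_lonlat(R @ lonlat_to_xyz(*geo))
```

The round trip (unrotate, then rotate) recovers the input point, which at least verifies the rotation algebra.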



Best regards from Knut-Frode


  ___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev

Re: [gdal-dev] Warping with GCPs at high latitudes

2013-01-25 Thread Knut-Frode Dagestad

On 24. jan. 2013 20:43, Even Rouault wrote:

Did you try your suggestion, to actually check that your idea
leads to the expected result? This is basically a matter of running:

1) gdaltransform -s_srs EPSG:4326 -t_srs "+proj=stere +lat_0=90 +lon_0=0" on
the coordinates of the GCPs of the source image
2) gdal_translate -a_srs "+proj=stere +lat_0=90 +lon_0=0" [-gcp x y X Y]*
src.tif src_reprojected_gcp.tif, where x y come from the source image and X Y
are the reprojected coordinates
3) gdalwarp src_reprojected_gcp.tif out.tif


I tried it now with the Python API, and it works perfectly, all the way to
the pole.


Thus the conclusion should be that GCPs can have any spatial reference 
system, but preferably one with no singularity within or near the 
destination area.


I used code like this:

from osgeo import gdal, osr

# Make transformer from GCP SRS to destination SRS
dstSRS = osr.SpatialReference()
dstSRS.ImportFromProj4(srsString)
srcSRS = osr.SpatialReference()
srcSRS.ImportFromWkt(srcDataset.GetGCPProjection())
transformer = osr.CoordinateTransformation(srcSRS, dstSRS)

# Reproject all GCPs
srcGCPs = self.vrt.dataset.GetGCPs()
dstGCPs = []
for srcGCP in srcGCPs:
    (x, y, z) = transformer.TransformPoint(srcGCP.GCPX, srcGCP.GCPY,
                                           srcGCP.GCPZ)
    dstGCP = gdal.GCP(x, y, z, srcGCP.GCPPixel, srcGCP.GCPLine,
                      srcGCP.Info, srcGCP.Id)
    dstGCPs.append(dstGCP)

# Update the dataset with the reprojected GCPs and the new SRS
self.vrt.dataset.SetGCPs(dstGCPs, dstSRS.ExportToWkt())

___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev


[gdal-dev] Warping with GCPs at high latitudes

2013-01-24 Thread Knut-Frode Dagestad

Hi,

I am having trouble performing accurate warping of images with GCPs
(lon, lat) in the high Arctic (to +proj=stere +lat_0=90 +lon_0=0);
the problems start already around 80 degrees of latitude.
Thin Plate Spline (-tps) performs better than polynomial fits, but the
output is often severely dragged at one or two edges, and sometimes it
looks like the destination image is mirrored across a line.
Trimming the images at the edges and adjusting the warp options SAMPLE_STEPS
and SAMPLE_GRID helps a little, but not sufficiently.


For the polynomial fits I assume the problem is related to precision, due
to the fact that lon and lat become singular at the pole. It should then
probably be better if the algorithm could:
- first convert the GCPs to the output projection (in my case stereographic,
which has no singularity for the relevant images)
- make the polynomial fit GCP_stereo -> (pixel, line)
- use this polynomial for the warping
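The suggested approach can be illustrated with NumPy alone: project the GCP lon/lat values to (spherical) north polar stereographic first, then make a least-squares fit from the projected coordinates to pixel/line. The GCPs and the simplified projection formulas below are hypothetical; this is a sketch of the idea, not GDAL's actual fitting code.

```python
import numpy as np

R_EARTH = 6371000.0

def polar_stereo(lon_deg, lat_deg):
    """Spherical north polar stereographic (+lat_0=90 +lon_0=0); sketch only."""
    lon, lat = np.radians(lon_deg), np.radians(lat_deg)
    rho = 2 * R_EARTH * np.tan(np.pi / 4 - lat / 2)
    return rho * np.sin(lon), -rho * np.cos(lon)

def inv_polar_stereo(x, y):
    # Inverse of the spherical polar stereographic above
    rho = np.hypot(x, y)
    lat = 90.0 - np.degrees(2 * np.arctan(rho / (2 * R_EARTH)))
    lon = np.degrees(np.arctan2(x, -y))
    return lon, lat

# Hypothetical GCPs near the pole, constructed so that the pixel position
# is an exact affine function of the stereographic coordinates
xg, yg = np.meshgrid(np.linspace(-3e5, 3e5, 5), np.linspace(-3e5, 3e5, 5))
lon, lat = inv_polar_stereo(xg.ravel(), yg.ravel())
pix = 100.0 + 1e-3 * xg.ravel()

# Fit pixel ~ a0 + a1*x + a2*y in the *projected* space (no singularity)
x, y = polar_stereo(lon, lat)
A = np.column_stack([np.ones_like(x), x, y])
coef, *_ = np.linalg.lstsq(A, pix, rcond=None)
pred = A @ coef  # predicted pixel positions; exact here by construction
```

In lon/lat space the same fit would break down near the pole; in the projected space it is a well-conditioned ordinary least-squares problem.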

I guess most GDAL users have a long wishlist of switches, but how about
a switch to control whether the GCP polynomial should instead be made in
the destination projection? Or a switch for gdal_translate to change the
GCPProjection and the GCP coordinates. A simpler and more pragmatic solution
could be a separate Python script, gdal_translate_gcps.py?


Are there other suggestions for dealing with these problems?


Best regards from Knut-Frode

___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev


[gdal-dev] gdalwarp performance with many GCPs

2012-12-12 Thread Knut-Frode Dagestad

Hi list,

When warping images with many GCPs, the -tps switch (Thin Plate Spline)
is necessary to get decent accuracy. However, this makes warping very
slow. The only method I found to increase speed is the -et switch, but
at the cost of spatial accuracy.



Below are some comments on the other things I tried, which did not help:

- Compiling GDAL with Armadillo support had no effect on speed.
Btw, to compile with Armadillo I had to manually insert -llapack into the
following line in configure:
if test -z "`${CXX} testarmadillo.cpp -o testarmadillo -larmadillo
2>&1`" ; then

Perhaps ${LIBS} should be added permanently to this line in trunk?

- Compiling GDAL with OpenCL also had no effect. At first this was
surprising, but looking at the OpenCL warp kernel it seems that it only
makes a difference for resampling algorithms other than nearest neighbour?


- Increasing memory with -wm had no effect.

- Using several threads (-multi -wo NUM_THREADS=ALL_CPUS) actually
increased the computing time significantly (my PROJ version is 4.7.1).


From the debug output, a lot of the time is apparently spent on:
WARP: Copying metadata from first source to destination dataset
Is this an indication that much time is simply spent reading and
writing the GCPs to/from file? If so, we could perhaps expect improved
performance if geolocation arrays could be used instead (not possible
due to http://trac.osgeo.org/gdal/ticket/4907).



The tests were made with the following file and command on Ubuntu:
http://dl.dropbox.com/u/15885758/testgcp.tif  (3212 GCPs and 2048x2511
pixels, covering Southern Europe)

time gdalwarp --debug on -et 5 -tps -t_srs '+proj=merc' testgcp.tif
out.tif  (+ the other switches mentioned above)



Best regards from Knut-Frode

___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev


Re: [gdal-dev] gdalwarp performance with many GCPs

2012-12-12 Thread Knut-Frode Dagestad

Hi Jan,

That sounds interesting and promising.

For the mentioned file it takes about 2 minutes with -et 5 (low 
accuracy), and 12 minutes without this switch. Without -tps it takes 
less than 0.5 seconds.

The machine is quite decent with 16 GB memory and SSD disk.

Best regards from Knut-Frode



On 12. des. 2012 13:24, Jan Hartmann wrote:

Hi Knut,

What do you mean by very slow? I regularly use gdalwarp -tps on much
larger rasters with a few thousand gcp-s, and never noticed unacceptable
delays. Do you have very little physical memory, or many parallel
processes running?

Jan

On 12/12/2012 01:12 PM, Knut-Frode Dagestad wrote:

Hi list,

When warping images with many GCPs, the -tps switch (Thin Plate
Spline) is found to be necessary to get decent accuracy. This makes
however warping very slow. The only method I found to increase speed
is the -et switch, but at cost of spatial accuracy.


Below some comments about the other tries which did not help:

- compiling GDAL with armadillo support had no effect on speed.
Btw, to compile with armadillo I had to manually insert -llapack in
the following line in configure:
if test -z "`${CXX} testarmadillo.cpp -o testarmadillo -larmadillo
2>&1`" ; then
Perhaps ${LIBS} should be added permanently to this line in trunk?

- compiling GDAL with OpenCL had also no effect. At first surprising,
but looking at the opencl warp kernel it seems that it only makes a
difference for other resampling algorithms than nearest neighbour?

- Increasing  memory with -wm had no effect

- Using several threads (-multi -wo NUM_THREADS=ALL_CPUS) actually
increased computing time significantly (my proj version is 4.7.1).

From the debug output, a lot of the time is apparently spent on:
WARP: Copying metadata from first source to destination dataset
Is this an indication that much time is simply spent on reading and
writing the GCPs to/from file? If so, we could perhaps expect improved
performance if Geolocation Arrays could be used instead (not possible
due to http://trac.osgeo.org/gdal/ticket/4907).


The tests are made with the following file and command on Ubuntu:
http://dl.dropbox.com/u/15885758/testgcp.tif  (3212 GCPs and 2048x2511
pixels covering Southern Europe)

time gdalwarp --debug on -et 5 -tps -t_srs '+proj=merc' testgcp.tif
out.tif  (+ the other switches mentioned above)


Best regards from Knut-Frode

___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev






Re: [gdal-dev] gdalwarp performance with many GCPs

2012-12-12 Thread Knut-Frode Dagestad
The GCPs are completely regular, every 40 pixels in each direction.
The warping quality is very good (only slow), so the GCPs are probably also OK.


I would appreciate it very much if you could download this file and test
the command below:

http://dl.dropbox.com/u/15885758/testgcp.tif

time gdalwarp -tps -t_srs '+proj=merc' testgcp.tif out.tif

If it works fast for you, the problem should be related to my local GDAL 
build (1.10).



On 12. des. 2012 13:40, Jan Hartmann wrote:

No, this is unacceptable, but I cannot reproduce it. The only problem I
had was with control points that were too close together. Are yours
evenly distributed? Can you do tests with subsets of control points?

On 12/12/2012 01:27 PM, Knut-Frode Dagestad wrote:

Hi Jan,

That sounds interesting and promising.

For the mentioned file it takes about 2 minutes with -et 5 (low
accuracy), and 12 minutes without this switch. Without -tps it takes
less than 0.5 seconds.
The machine is quite decent with 16 GB memory and SSD disk.

Best regards from Knut-Frode



On 12. des. 2012 13:24, Jan Hartmann wrote:

Hi Knut,

What do you mean by very slow? I regularly use gdalwarp -tps on much
larger rasters with a few thousand gcp-s, and never noticed unacceptable
delays. Do you have very little physical memory, or many parallel
processes running?

Jan

On 12/12/2012 01:12 PM, Knut-Frode Dagestad wrote:

Hi list,

When warping images with many GCPs, the -tps switch (Thin Plate
Spline) is found to be necessary to get decent accuracy. This makes
however warping very slow. The only method I found to increase speed
is the -et switch, but at cost of spatial accuracy.


Below some comments about the other tries which did not help:

- compiling GDAL with armadillo support had no effect on speed.
Btw, to compile with armadillo I had to manually insert -llapack in
the following line in configure:
if test -z "`${CXX} testarmadillo.cpp -o testarmadillo -larmadillo
2>&1`" ; then
Perhaps ${LIBS} should be added permanently to this line in trunk?

- compiling GDAL with OpenCL had also no effect. At first surprising,
but looking at the opencl warp kernel it seems that it only makes a
difference for other resampling algorithms than nearest neighbour?

- Increasing  memory with -wm had no effect

- Using several threads (-multi -wo NUM_THREADS=ALL_CPUS) actually
increased computing time significantly (my proj version is 4.7.1).

From the debug output, a lot of the time is apparently spent on:
WARP: Copying metadata from first source to destination dataset
Is this an indication that much time is simply spent on reading and
writing the GCPs to/from file? If so, we could perhaps expect improved
performance if Geolocation Arrays could be used instead (not possible
due to http://trac.osgeo.org/gdal/ticket/4907).


The tests are made with the following file and command on Ubuntu:
http://dl.dropbox.com/u/15885758/testgcp.tif  (3212 GCPs and 2048x2511
pixels covering Southern Europe)

time gdalwarp --debug on -et 5 -tps -t_srs '+proj=merc' testgcp.tif
out.tif  (+ the other switches mentioned above)


Best regards from Knut-Frode

___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev






Re: [gdal-dev] gdalwarp performance with many GCPs

2012-12-12 Thread Knut-Frode Dagestad

On 12. des. 2012 14:16, Jan Hartmann wrote:


It's far too slow here too, so it's not your build.

I see that the warp itself doesn't start until after a very long time, so
the problem lies in the initial computation of the transformation
matrix. It must have something to do with the distribution of points,
perhaps too close together, so as to cause singularities. The target
coordinates are lat/lon values, numerically very close. I see two testing
possibilities:

1) Try to warp with subsets of points
2) Multiply all target coordinates with (e.g.) 1000. You'll get an
impossible georeference, but if this goes much faster, you know where
the problem lies.

Jan



Scaling lon and lat by 1000 did not help, but when using only every
5th GCP in each direction it is very fast (3 seconds with -tps) and
still gives good accuracy (as long as I keep the border GCPs).

So thank you, Jan, for spotting the problem!

It might be useful to have a -decrease-GCP-density-factor switch which the
user could fine-tune for performance, or ideally the warping
routines would calculate the optimal factor directly.
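Such a density-reduction switch does not exist, but the thinning itself can be done before warping. Below is a hypothetical sketch: `get_rc` is an assumed helper returning each GCP's (row, col) position in the regular GCP grid, and plain tuples stand in for gdal.GCP objects.

```python
def thin_gcps(gcps, factor, get_rc):
    """Keep every `factor`-th GCP per direction, always retaining the border.

    gcps: iterable of GCP-like objects; get_rc(g) must return the GCP's
    (row, col) position in the regular GCP grid (hypothetical helper).
    """
    rows = sorted({get_rc(g)[0] for g in gcps})
    cols = sorted({get_rc(g)[1] for g in gcps})
    # Every factor-th row/col, plus the first and last (the border GCPs)
    keep_rows = set(rows[::factor]) | {rows[0], rows[-1]}
    keep_cols = set(cols[::factor]) | {cols[0], cols[-1]}
    return [g for g in gcps
            if get_rc(g)[0] in keep_rows and get_rc(g)[1] in keep_cols]

# Toy usage with (row, col, lon, lat) tuples standing in for gdal.GCP:
grid = [(r, c, -30 + c, 60 + r) for r in range(9) for c in range(9)]
thinned = thin_gcps(grid, 5, lambda g: (g[0], g[1]))
```

Keeping the border rows and columns matches the observation above that accuracy holds up as long as the edge GCPs survive the thinning.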



Another thing: I originally wanted to warp using the Python API
(AutoCreateWarpedVRT), but it seems to have no option to use -tps (?),
so I have to call gdalwarp as a system command.


Knut-Frode

___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev


[gdal-dev] Warping quality with small geolocation arrays

2012-11-26 Thread Knut-Frode Dagestad

Dear list,

Warping datasets with geolocation arrays works fine if the arrays have 
the same size as the other raster bands. However, if the geolocation 
arrays are much smaller, the quality is very bad.


Say I have a 2000x2000 px dataset with 100x100 px geolocation
arrays. Then a warped image has only 200x200 blocks of identical
values, even if the size of the warped image is much larger. If I change
the interpolation algorithm (nearest, cubic, ...) the values change
slightly, but the real output resolution remains low. This is the same
whether using the Python API or gdalwarp. As a workaround I convert the
geolocation arrays to GCPs. This gives decent warping quality, but is a
bit awkward.


Is this the expected behavior of the present implementation of 
geolocation arrays in GDAL?
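For comparison, the behaviour one would expect is bilinear interpolation between the geolocation-array nodes instead of blocks of repeated values. A minimal NumPy sketch of that interpolation (an illustration, not GDAL's implementation):

```python
import numpy as np

def upsample_bilinear(geo, out_shape):
    """Bilinearly interpolate a small geolocation array to out_shape."""
    ny, nx = geo.shape
    # Fractional positions of output pixels in the input grid
    yi = np.linspace(0, ny - 1, out_shape[0])
    xi = np.linspace(0, nx - 1, out_shape[1])
    y0 = np.clip(np.floor(yi).astype(int), 0, ny - 2)
    x0 = np.clip(np.floor(xi).astype(int), 0, nx - 2)
    wy = (yi - y0)[:, None]  # weight towards the next row
    wx = (xi - x0)[None, :]  # weight towards the next column
    g = geo
    return ((1 - wy) * (1 - wx) * g[np.ix_(y0, x0)]
            + (1 - wy) * wx * g[np.ix_(y0, x0 + 1)]
            + wy * (1 - wx) * g[np.ix_(y0 + 1, x0)]
            + wy * wx * g[np.ix_(y0 + 1, x0 + 1)])

# A 5x5 longitude array upsampled to 100x100 stays smooth (no 20x20 blocks)
lons = np.linspace(10, 14, 5)[None, :].repeat(5, axis=0)
big = upsample_bilinear(lons, (100, 100))
```

With a linearly varying geolocation array the interpolation reproduces the underlying gradient exactly, which is the behaviour the blocky output above is missing.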


Best regards from Knut-Frode

___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev


Re: [gdal-dev] Warping quality with small geolocation arrays

2012-11-26 Thread Knut-Frode Dagestad

Frank,

I created a ticket with the same description as below:
http://trac.osgeo.org/gdal/ticket/4907

My test cases are from self-made raw VRTs, so I have no immediately
reproducible test image/script to attach. However, if useful, I can
construct an example to attach in a couple of days.


Thank you for looking into this.

Best regards from Knut-Frode



Den 26.11.2012 22:14, skrev Frank Warmerdam:

On 12-11-26 12:11 PM, Knut-Frode Dagestad wrote:

Dear list,

Warping datasets with geolocation arrays works fine if the arrays have
the same
size as the other raster bands. However, if the geolocation arrays are
much
smaller, the quality is very bad.

Say I have a 2000x2000 px dataset with 100x100 px sized geolocation
arrays.
Then a warped image have only 200x200 blocks of identical values, even
if the
size of the warped image is much larger. If I change the interpolation
algorithm (neareast, cubic...) the values change slightly, but the
real output
resolution remains low. This is the same whether using Python API or
gdalwarp.
As a workaround I convert the geolocation arrays to GCPs. This gives
decent
warping quality, but is a bit awkward.

Is this the expected behavior of the present implementation of
geolocation
arrays in GDAL?


Knut-Frode,

The intention is that the geolocation information will be linearly
interpolated from the geolocation array.  However, I gather you are not
seeing that effect.

Looking at the backmap it seems that this linear interpolation is not
being done, even though it is (after a fashion) in the forward case
transformation case.  I'd appreciate it if you could submit a bug on
this issue, and I'll look into correcting it.  It is a pretty serious
error.

Best regards,



___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev


[gdal-dev] Temporal statistics

2012-07-13 Thread Knut-Frode Dagestad

Hi,

From a time series of colocated images, I construct a VRT with one band
for each time step using gdalbuildvrt:

 gdalbuildvrt -separate mtsat.vrt mtsat/*_IR*00.tif

Are there any tools that can be used to calculate statistics (min,
max, mean, etc.) versus time for each pixel of such a 3D dataset?
The rasters are large and the time series long, so reading everything into a
Python NumPy cube is not a good solution.
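One way around the memory problem, whatever tool ends up doing it, is to accumulate the statistics one time step at a time, so only a single band is in memory at once. A sketch of the running min/max/mean accumulation: the per-band arrays would come from e.g. ReadAsArray on each VRT band; toy arrays are used here.

```python
import numpy as np

def temporal_stats(band_iter):
    """Accumulate per-pixel min/max/mean over an iterator of 2D arrays."""
    count = 0
    for band in band_iter:
        band = np.asarray(band, dtype=np.float64)
        if count == 0:
            # First time step initializes the accumulators
            vmin = band.copy()
            vmax = band.copy()
            vsum = band.copy()
        else:
            np.minimum(vmin, band, out=vmin)
            np.maximum(vmax, band, out=vmax)
            vsum += band
        count += 1
    return vmin, vmax, vsum / count

# Toy time series: three 2x2 "bands"
bands = [np.full((2, 2), v) for v in (1.0, 3.0, 2.0)]
vmin, vmax, vmean = temporal_stats(iter(bands))
```

Memory use is a few copies of one band, independent of the length of the time series.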

Thanks,
Knut-Frode

___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev


[gdal-dev] Re: reading NOAA-19 AVHRR

2012-03-01 Thread Knut-Frode Dagestad

Even,

Your suggestion sounds plausible and is probably both correct and
necessary. But unfortunately it is still not working with your patch, so
I guess there must be an additional problem.
Regardless of whether I use the patch or not, and whether I add --debug on
or not, the output from gdalinfo is:


gdalinfo --debug on avhrr_20120227_235800_noaa19.hrp
ERROR 4: `avhrr_20120227_235800_noaa19.hrp' not recognised as a 
supported file format.

gdalinfo failed - unable to open 'avhrr_20120227_235800_noaa19.hrp'.

After some investigation, I have found one potential reason: according to
this page, this file (from a Eumetcast antenna) is in big-endian format:

http://www.eumetsat.int/Home/Main/Satellites/GroundNetwork/EARSSystem/EARS-AVHRR/index.htm?l=en
("For NOAA KLM and NN': HRPT data, expanded to 16 bits, big endian,
bzip2 compressed.")


I see there are some old discussions related to endianness in GDAL:
http://lists.osgeo.org/pipermail/gdal-dev/2002-April/003355.html

and it is also mentioned in the driver tutorial:
http://www.gdal.org/gdal_drivertut.html  (keyword bNativeOrder)

But I am not able to see how to fix this problem, and I am not sure
whether endianness is relevant at all. I could make a sample file
available for anyone interested in looking at the problem.


Best regards from Knut-Frode



On 28. feb. 2012 17:06, Even Rouault wrote:

Knut-Frode,

I've skimmed quickly through the driver code and if you are lucky, it is
probably just a matter of changing a few lines (well, I can be wrong).

If you have a debug build of GDAL, you can first try to run gdalinfo --debug on
avhrr_20120227_235800_noaa19.hrp . My guess (based on
http://en.wikipedia.org/wiki/NOAA-19 saying that NOAA-19 = NOAA-N', and NOAA-N'
appearing in commented code of the driver...) would be that you would see:
Unknown spacecraft ID 8.

If so, you could test the following patch :

Index: l1bdataset.cpp
===================================================================
--- l1bdataset.cpp  (revision 24036)
+++ l1bdataset.cpp  (working copy)
@@ -61,6 +61,7 @@
     NOAA16, // NOAA-16(L)
     NOAA17, // NOAA-17(M)
     NOAA18, // NOAA-18(N)
+    NOAA19, // NOAA-19(N')
     METOP2  // METOP-2(A)
 };
 
@@ -1025,10 +1026,10 @@
         case 7:
             eSpacecraftID = NOAA18;
             break;
-/* FIXME: find appropriate samples and test these two cases:
-        case 8:
-            eSpacecraftID = NOAA19;
+        case 8:
+            eSpacecraftID = NOAA19;
             break;
+/* FIXME: find appropriate samples and test that case:
         case 11:
             eSpacecraftID = METOP1;
             break;*/
@@ -1155,6 +1156,9 @@
         case NOAA18:
             pszText = "NOAA-18(N)";
             break;
+        case NOAA19:
+            pszText = "NOAA-19(N')";
+            break;
         case METOP2:
             pszText = "METOP-2(A)";
             break;

Perhaps Andrey Kiselev, the author of the driver, would have better clues.

Best regards,

Even



___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev


[gdal-dev] reading NOAA-19 AVHRR

2012-02-28 Thread Knut-Frode Dagestad

Hi,

The GDAL AVHRR driver seems to work only for data from the older NOAA-9
to NOAA-17 satellites: http://gdal.org/frmt_l1b.html


Hence it does not work for my NOAA-19 AVHRR files in HRP format (from
Eumetcast), which have filenames like avhrr_20120227_235800_noaa19.hrp


I have not been able to find any software to convert such images to a
format which can be read by GDAL.


Does anyone have some hints?

Best regards from Knut-Frode

___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev


[gdal-dev] Converting between row/col numbers of two datasets

2012-01-18 Thread Knut-Frode Dagestad

Hi group,

In searching for the reason for the bug reported in ticket #4442 [1], I 
have made some tests with gdal.Transformer in Python:


A transformer defined generally as:
T = gdal.Transformer(src_ds, dst_ds, [''])
can be used to convert row/col of the src image to row/col of the dst image.
Behind the scenes, according to the GDAL documentation [2], this consists of
3 steps:


(source row/col) -> (source coordinates) -> (destination coordinates)
-> (destination row/col)
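For the two geotransform legs of this chain, the mapping is just the affine geotransform and its inverse, which makes the expected composition easy to verify by hand. A small sketch with two hypothetical north-up geotransforms in the same SRS (so the middle step is the identity), not using gdal.Transformer:

```python
def apply_gt(gt, pixel, line):
    """Affine geotransform: pixel/line -> georeferenced x/y."""
    x = gt[0] + pixel * gt[1] + line * gt[2]
    y = gt[3] + pixel * gt[4] + line * gt[5]
    return x, y

def invert_gt(gt, x, y):
    """Inverse mapping for a north-up geotransform (gt[2] == gt[4] == 0)."""
    return (x - gt[0]) / gt[1], (y - gt[3]) / gt[5]

src_gt = (10.0, 0.01, 0.0, 60.0, 0.0, -0.01)  # hypothetical source
dst_gt = (9.5, 0.02, 0.0, 61.0, 0.0, -0.02)   # hypothetical destination

# src row/col -> coordinates -> dst row/col (steps 1 and 3 of the chain;
# step 2 is the identity here since both datasets share the same SRS)
x, y = apply_gt(src_gt, 100, 200)
dst_pixel, dst_line = invert_gt(dst_gt, x, y)
```

This is exactly the chain a correct gdal.Transformer(src_ds, dst_ds, ['']) should perform; the bug described below amounts to the final invert_gt-like step being skipped when the destination dataset has only GCPs.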


My tests show that this works fine if the src image has only GCPs and
the dst image has a geotransform, but not vice versa.
In the latter case, the output is not row/col of the GCP image, but
rather the coordinates (lon and lat); i.e., the last step of the chain
above has not been performed. Enforcing the same projection for the two
datasets makes no difference.
This should explain why, in the tests of ticket #4442, the
geotransform image is placed in the upper left corner of the output
GCP image: the lon/lat values are mistaken for destination row/col values.


However, it is also possible to define transformers which convert
(back and forth) between row/column and coordinates of either the src
or the dst image:

T_src = gdal.Transformer(src_ds, None, [''])
T_dst = gdal.Transformer(dst_ds, None, [''])

These correspond to the first and last steps of the chain above. The
interesting thing is that, combining these, it works well to calculate
row/col of the GCP image from row/col of the geotransform image.


Can anyone explain why the direct transformation then does not work? I
suspect a small fix could possibly solve this problem.


Best regards from Knut-Frode


[1] http://trac.osgeo.org/gdal/ticket/4442
[2] http://www.gdal.org/gdal__alg_8h.html#7671696d085085a0bfba3c3df9ffcc0a


The script below can be used to reproduce this:

from osgeo import gdal

GCPfile = 'MER_RR__2PNPDK20030809_100609_22512018_00466_07534_3898.N1'
# http://envisat.esa.int/services/sample_products/meris/RR/L2/

geotransformfile = 'europe.tif'
# http://www.eea.europa.eu/data-and-maps/data/world-digital-elevation-model-etopo5/zipped-dem-geotiff-raster-geographic-tag-image-file-format-raster-data/

# --- Open source and destination datasets ---
if 0:
    # This works
    src_ds = gdal.Open(GCPfile)
    dst_ds = gdal.Open(geotransformfile)
else:
    # This does not work
    src_ds = gdal.Open(geotransformfile)
    dst_ds = gdal.Open(GCPfile)

# --- Make transformers ---

# To transform from pixel/line of the src image to pixel/line of the dst image:
T_full = gdal.Transformer(src_ds, dst_ds, [''])

# To transform between pixel/line and coordinates on the src image
T_src = gdal.Transformer(src_ds, None, [''])

# To transform between pixel/line and coordinates on the dst image
T_dst = gdal.Transformer(dst_ds, None, [''])

# --- Test transformers ---
src_rowcol = (100, 100)
coords = T_src.TransformPoint(0, src_rowcol[0], src_rowcol[1])[1]
print 'Pixel ' + str(src_rowcol) + ' in src image has coordinates: ' + str(coords)

dst_rowcol = T_dst.TransformPoint(1, coords[0], coords[1])[1]
print 'These lon/lat values correspond to this pixel in the destination image: ' + str(dst_rowcol)

print 'Direct transformation: ' + str(T_full.TransformPoint(0, src_rowcol[0], src_rowcol[1]))




___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev


[gdal-dev] Longitude and latitude raster bands

2012-01-10 Thread Knut-Frode Dagestad

Hi,

For datasets with only GCPs for geolocation, is there any convenient and
efficient method to extract latitude and longitude values as arrays of
the same size as the raster bands, using the Python API?


And similarly: is there an efficient method to return full arrays of the
azimuth direction, e.g. defined as 0 for pixels where north is up (as
it would always be in a lon/lat or Mercator projection)?
This angle is relevant e.g. when one wants to know the sensor look
direction for satellite images.
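Given full lon/lat arrays, the second quantity can be approximated with finite differences between neighbouring rows: the direction of the image "up" vector expressed in local east/north components. A rough NumPy sketch under an equirectangular small-distance approximation (an illustration, not an existing GDAL function):

```python
import numpy as np

def up_azimuth(lon, lat):
    """Approximate azimuth (degrees, 0 = north) of the image 'up' direction.

    lon, lat: 2D arrays of pixel-centre coordinates; rows are assumed
    ordered top to bottom. Uses a local equirectangular approximation.
    """
    # Change from one row to the row above it (image 'up')
    dlat = -np.diff(lat, axis=0)
    dlon = -np.diff(lon, axis=0)
    # Scale longitude differences by cos(lat) to get metric east offsets
    east = dlon * np.cos(np.radians(lat[:-1, :]))
    north = dlat
    az = np.degrees(np.arctan2(east, north))
    # Repeat the last row so the output matches the input shape
    return np.vstack([az, az[-1:, :]])

# Regular lon/lat grid with north up: azimuth should be ~0 everywhere
lon, lat = np.meshgrid(np.linspace(0, 10, 6), np.linspace(60, 50, 6))
az = up_azimuth(lon, lat)
```

For a north-up lon/lat grid this returns zero, as the question's definition requires; for a swath geometry it gives the per-pixel rotation of the image relative to north.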


Best regards from Knut-Frode

___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev


[gdal-dev] Warping onto an image with only GCPs

2011-12-15 Thread Knut-Frode Dagestad

Hi,

Hróbjartur Þorsteinsson explained to me a near-hidden feature of GDAL:
reprojecting one image onto the coverage of another.


Given two images:
imageA.tiff
imageB.tiff

Then an empty image can be created from image A with e.g.
$ gdal_translate -ot Float32 -scale 0 0 999 999 -a_nodata 999
imageA.tiff domainA.tiff


And then imageB can be warped onto the extent of this image simply with:
$ gdalwarp imageB.tiff domainA.tiff

domainA.tiff will then contain the data from imageB, exactly co-located 
with imageA.


The problem is that this only works when imageA contains a geotransform, 
and not when it contains only GCPs.



So my question is:
Is there any workaround to warp data from one image onto another image
which is geolocated only by GCPs?




With the -to option of gdalwarp it is possible to pass some parameters
to GDALCreateGenImgProjTransformer2(). In the documentation of this
function I find the following option:


METHOD: may have a value which is one of GEOTRANSFORM, GCP_POLYNOMIAL,
GCP_TPS, GEOLOC_ARRAY, RPC, to force only one geolocation method to be
considered on the source dataset.

http://www.gdal.org/gdal__alg_8h.html#94cd172f78dbc41d6f407d662914f2e3

But there is no corresponding option to force the use of GCP_POLYNOMIAL for
the *destination* dataset. I guess this is a bad sign?



Thank you for any help!
Knut-Frode

___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev


[gdal-dev] Re: discussion on improvements to the NetCDF driver and CF-1 convention

2011-08-26 Thread Knut-Frode Dagestad

Etienne,

My group is also very much in support of the suggested improvements, as
NetCDF/CF is becoming the most important standard for data storage and
exchange in our community (climate, meteorology, oceanography).
My group is not much into GDAL driver development (at least not yet),
but we are now building our future very much around GDAL, so we are
interested in joining the discussion around improved support for NetCDF/CF
reading and writing.


About some of the mentioned issues:

1) The latest version of CF is 1.5, so it would be better to target 1.5
(or 1.x) than 1.0. Ideally the driver could be made generic, for smooth
upgrades to newer (or downgrades to older) versions of the CF conventions.


5) It sounds reasonable to assume the WGS84 datum if none is specified. I
believe at least 99% of the data typically stored in NetCDF/CF has
this datum, and even when that is not the case, positioning errors of less
than 100 m should normally have little impact for these types of data.
Precision at the datum level is probably more relevant for more typical
GIS data such as roads, borders etc.

A warning could, however, be issued each time the driver has to assume WGS84.

7) We agree in particular that proper handling of the time axis is of high
value. This is one of the issues that is highly relevant for
climate data (often 4-dimensional), but not well supported by typical
GIS software or standards designed with 2-dimensional data in mind.
A general representation of a time axis (or even a vertical axis) in GDAL
would be ideal, but that would probably be too great an effort for anyone
to develop.



Knut-Frode
Nansen Environmental and Remote Sensing Center



On 24/08/2011 22:37, Etienne Tourigny wrote:

Hi all,

I would like to start a discussion with those interested about fixing
various issues in the NetCDF driver.

As it stands, it is not fully CF-1.0 compliant, and produces
geographical grids that are not valid for other software.
Also, the infamous upside-down problem has been addressed for
importing NetCDFs, but still needs to be addressed for exporting.

Related ticket: http://trac.osgeo.org/gdal/ticket/2129
I have been submitting patches for some time to fix the related issues.

I have identified a number of issues, of which the following have been
fixed partially in my proposed patch.

1- conform to the cf-1.0 standard for geographical grids (creating lat
and lon dimensions and variables)
2- make netcdf grids bottom-up by default, as all software out there
uses that convention
3- fix problems related to floating-point / string conversions (also
reported in issue #4200 )
4- fix metadata handling and duplication (as reported in issue #4204),
and keep band (variable) metadata outside of the global metadata

Pending issues:

5- how to deal with netcdfs which have no datum? Can we assume WGS84?
These files are extremely common, and CF-1.0 has provisions for
identifying the key variables, but no Proj/Wkt codes.
6- fix proper export of projected CRS (this is not fully necessary
though) with lat/lon values respecting CF-1.0
7- handle time axis properly (at least when creating a netcdf) - that
would be great for import/export of netcdf

Regards,
Etienne


___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev



___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev


[gdal-dev] Re: How to get the ENVISAT N1 Main Processing parameters

2011-05-16 Thread Knut-Frode Dagestad

Hi RSyaoxin and Antonio,

I share your interest in these metadata. In addition to the MPH and SPH
metadata in ASCII format, Envisat files contain a lot of metadata in
binary form, e.g. for ASAR:

http://envisat.esa.int/handbooks/asar/CNTR6-6.htm#eph.asar.asardf.asarrec

It would probably be too much to ask for the GDAL Envisat driver to
import all of this. However, some of the missing metadata are
of special importance:


- the calibration factors, which RSyaoxin mentions
- the incidence angles:
http://envisat.esa.int/handbooks/asar/CNTR6-6-9.htm#eph.asar.asardf.asarrec.ASAR_Geo_Grid_ADSR


Both of these are necessary to calibrate the data, and hence to
make any geophysical calculations from it.
It would be really nice if the GDAL Envisat driver could be updated to
read these as well; most users would then not need to rely on (or create)
external software.


The incidence angles are stored in a small grid of the same size as the
GCPs (on the order of 100 elements), which does not fit very nicely into
the GDAL data model; they would probably have to be stored as a list of
metadata strings.


Best regards from Knut-Frode



On 15/05/2011 18:18, Antonio Valentino wrote:

Hi RSyaoxin,

Il 15/05/2011 14:46, RSyaoxin ha scritto:

Hi, I want to calibrate ENVISAT ASAR data (.N1 format). I need to get
its main processing parameters, such as
calibration_factors[0].ext_cal_fact[0] and first_line_tie_point_angles[0-10],
and I wrote:
CString sDomain = m_pDataset->GetMetadataItem("MAIN_PROCESSING_PARAMS_ADS_CALIBRATION_FACTORS[0].EXT_CAL_FACT[0]");
but I get no result. However,
CString sDomain = m_pDataset->GetMetadataItem("SPH_FIRST_MID_LAT");
works. Why is that? How should I get the necessary information?
I turn to anybody who has done this job before for advice. Thank you!
Kind regards.



It seems that the ENVISAT driver does not extract all metadata from all
ENVISAT records.
It just exposes MPH, SPH and GCPs:

http://www.gdal.org/frmt_various.html#Envisat


From my point of view it would be very useful to have access
to a richer set of metadata from GDAL (at least for SAR products).

If you are only interested in ENVISAT then you could give a try to

https://github.com/bcdev/epr-api

which gives you full access to the ENVISAT products even if, IMHO, it is
not as handy as GDAL for imagery I/O.

Maybe you can use them both: EPR for metadata and GDAL for imagery.

best regards




___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev


[gdal-dev] Re: VRT Derived Bands from Python?

2011-05-13 Thread Knut-Frode Dagestad

Hi Even and Antonio,

Thank you Even for explaining how to make the impossible possible!
I will explore this option.

The suggestion of Antonio to include a default set of PixelFunctions is 
an excellent idea, as it lowers the threshold for newcomers, like me. 
The proposed standard functions (#3367) should also serve many basic 
needs, and in fact, most of mine.
On the other hand, I do not see any apparent limitation to how 
sophisticated the PixelFunctions can be. I am not sure if it is a good 
idea, but I am considering the option of putting in relatively complex 
algorithms with loops, iterations and inversions - i.e. things that are 
better done in C than in Python.


Best regards from Knut-Frode



On 11/05/2011 21:05, Antonio Valentino wrote:

Hi Even, hi Knut-Frode,

On 11/05/2011 20:08, Even Rouault wrote:

On Wednesday 11 May 2011 11:53:55, Knut-Frode Dagestad wrote:

Hi,

Derived Bands of the VRT format seems like a very powerful resource.
  From the tutorial (http://www.gdal.org/gdal_vrttut.html) it is clear
that the Pixel Functions themselves have to be written in C, which is
probably anyway best for performance.
I also get the impression that the corresponding VRT files with Derived
Bands can only be used by applications written in C/C++, since the Pixel
Functions are only available inside the scope of such an application
program:
http://gis.stackexchange.com/questions/2940/gdal-pixel-functions-question

Does this mean that Pixel Functions can not be used by command line
utilities (e.g. gdalwarp and gdal_translate) or from Python?


The quick answer is : you can't use VRTDerivedBand from command line
utilities or Python.

The long answer is : but of course, as often in situations where you can't,
you can... Provided you have a C compiler handy, the workaround here is to
benefit from the GDAL plugin mechanism for other purposes than writing a GDAL
driver.

Create a gdalmypixelfunction.c whose content is :

#include "gdal.h"

CPLErr MyFunction(void **papoSources, int nSources, void *pData,
 int nXSize, int nYSize,
 GDALDataType eSrcType, GDALDataType eBufType,
 int nPixelSpace, int nLineSpace)
{
/* implementation to write here */
}

void GDALRegisterMe() /* this is the entry point. don't change this name */
{
 GDALAddDerivedBandPixelFunc("MyFunction", MyFunction);
}

Compile it as a shared library, called gdal_.so (or .dll on
windows), define GDAL_DRIVER_PATH to point to the directory of the shared
library, et voilà ! The library will be loaded at runtime when the utilities
or the python bindings call the GDALRegisterAll() method and the pixel
function will be available to the VRT driver.


This is exactly the solution I'm currently using and I can confirm that
it works perfectly both on Linux and Windows.

I wrote a little plugin to register pixel functions attached to #3367
[1] (together with other more specific for my application domain) and
now both command line tools and python scripts can use pixel functions.

The only limitation is that there is no way, as far as I know, to write
a VRT file with a VRTDerivedRasterBand using the GDAL Python API.

While I can set the Band subClass at band creation time as follows:

ds.AddBand(gdal.GDT_Float32, ['subClass=VRTDerivedRasterBand'])

I was not able to find a way to set the PixelFunctionType and the
SourceTransferType parameters using the GDAL Python API.
As a workaround I write the VRT to disk and then I perform direct XML
manipulations via xml.etree (from the python standard lib).
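
A rough sketch of that post-processing step (not Antonio's actual code; the pixel function name is a placeholder and the VRT snippet is a minimal hand-written example):

```python
import xml.etree.ElementTree as ET

# Minimal VRT as GDAL might have written it (hypothetical 1x1 example)
vrt_xml = """<VRTDataset rasterXSize="1" rasterYSize="1">
  <VRTRasterBand dataType="Float32" band="1" subClass="VRTDerivedRasterBand">
    <SimpleSource>
      <SourceFilename>src.tif</SourceFilename>
      <SourceBand>1</SourceBand>
    </SimpleSource>
  </VRTRasterBand>
</VRTDataset>"""

root = ET.fromstring(vrt_xml)
band = root.find('VRTRasterBand')
# Inject the two elements the Python API cannot set directly
ET.SubElement(band, 'PixelFunctionType').text = 'MyFunction'
ET.SubElement(band, 'SourceTransferType').text = 'Float64'
patched = ET.tostring(root, encoding='unicode')
```

In practice one would write `patched` back to the .vrt file before opening it with GDAL.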

Do I miss something?

It would be very nice to be able to set PixelFunctionType and
SourceTransferType as additional creation options.


[1] http://trac.osgeo.org/gdal/ticket/3367

Best regards






[gdal-dev] Re: VRT Derived Bands from Python?

2011-05-13 Thread Knut-Frode Dagestad

On 13/05/2011 12:07, Antonio Valentino wrote:


If you think it is useful I can attach the fake-driver code to the
#3367.


Thank you, that would be useful.

In the meantime I have tested this approach, nearly successful:
I copied one of your pixel-functions from #3367 and made a dynamic 
library file with:


gcc -fPIC -c gdal_nerscpixelfun.c
gcc -shared -lgdal -o gdal_nerscpixelfun.so gdal_nerscpixelfun.o

and then set GDAL_DRIVER_PATH to point to this folder.

However, at GDAL startup I get the following error:
ERROR 1: dlsym(0x100e095f0, _GDALRegister_nerscpixelfun): symbol not found

Any ideas what is wrong? OS is Mac OS X 10.6.7. I am not a C expert.

Best regards from Knut-Frode



[gdal-dev] Re: VRT Derived Bands from Python?

2011-05-13 Thread Knut-Frode Dagestad

On 13/05/2011 19:07, Even Rouault wrote:


I had a similar error on Linux too, but it worked despite the error. I have
pushed a fix in trunk to silence that error. You can try to add a printf()
statement in your GDALRegisterMe function to check if it is loaded correctly.


You are right, it works despite the error message!
I did not even think about checking that...
Great, thanks a lot once again!



[gdal-dev] Interpolate one image onto another

2011-05-02 Thread Knut-Frode Dagestad

Hi!

If I have two overlapping raster images, how can I interpolate one of 
them to the exact coverage of the other image, so that they can be 
compared pixel by pixel?


I primarily want to do this with Python API, but tips on how to do it 
with commandline utilities are also welcome. The two images do generally 
have different map projections, and either can be georeferenced by GCP's 
or affine transformations.
It would also be an advantage to be able to select interpolation method 
(nearest, linear, etc). It is preferred to do this in memory, without 
the need for (temporarily) writing to file.



Best regards from Knut-Frode



[gdal-dev] Re: dev-version of HDF4

2011-04-20 Thread Knut-Frode Dagestad
I forgot to say, but I have also tried specifying the location of 
HDF4, with a similar error:


...
checking for SDreaddata in -lmfhdfalt... no
checking for SDreaddata in -lmfhdf... no
checking for SDreaddata in -lhdf4... no
checking for SDreaddata in -lmfhdf... no
checking for SDreaddata in -lmfhdf... no
configure: error: HDF4 support requested with arg 
/Users/knutfd/Software/hdf-4.2.5/, but neither hdf4 nor mfhdf lib found

...

HDF4 library is there, but somehow configure is not able to detect it. 
HDF4 is compiled without SLIB, so that should not be the problem.
I have also tried with older version of HDF4 (4.2r1), other installation 
points, and prebuilt binaries. And I compiled with options 
--disable-fortran --disable-netcdf --with-pic, as suggested on 
http://osgeo-org.1803224.n2.nabble.com/Error-in-GDAL-compiling-with-HDF4-td3351437.html
It has been suggested to use a dev version of HDF4, but I don't know 
if that would solve this problem?


Best regards from Knut-Frode




On 19/04/2011 21:51, Nikolaos Hatzopoulos wrote:

hdf4 libraries not found - that's what the log says :)
Do a
locate libhdf4
to see where the library is


On Tue, Apr 19, 2011 at 10:24 AM, Knut-Frode Dagestad
knutfrodesop...@hotmail.com wrote:

I am trying to compile GDAL from source with support for HDF4 with:

./configure --with-hdf4

but get complaints about missing libraries, see extract from config.log:

ld: library not found for -lmfhdfalt
...
ld: library not found for -lmfhdf
...
ld: library not found for -lhdf4
...
ld: library not found for -lmfhdf
configure:19940: checking for SDreaddata in -lmfhdf
configure:19965: gcc -o conftest -g -O2   conftest.c -lmfhdf -ldf
-lsz -ljpeg -lz -lz -lpthread -ldl 5
ld: library not found for -lmfhdf
configure:20009: error: HDF4 support requested with arg yes, but
neither hdf4 nor mfhdf lib found


Similar errors are also reported by others, e.g. this one using
mapserver:
http://osgeo-org.1803224.n2.nabble.com/cannot-find-lmfhdf-td5286909.html

I have tried with various versions of HDF4, both binaries and
compiled from source. The link above (and also
http://trac.osgeo.org/gdal/wiki/BuildingOnUnix) suggests to try a
hdf-dev version. But where can I find such dev-versions, in my
case for Mac OS X 10.6.7?

Would it be possible to update GDAL to work with standard HDF4
libraries?


Best regards from Knut-Frode







[gdal-dev] Re: GDAL drivers written in Python

2011-04-07 Thread Knut-Frode Dagestad

Hi Even,

On 06.04.2011 20:21, Even Rouault wrote:

You can certainly write a (non-GDAL) reader in Python, but of course, you
won't be able to benefit from the GDAL API with the dataset returned by your
reader, or using them in software that use the GDAL API.


But if I use the GDAL API to create a GDAL Dataset and then insert a 
matrix of data (read with non-GDAL code), and further add some 
georeferencing or tie-points, it seems to me that I then can utilize 
GDAL API functionality for e.g. reprojecting, subsetting and exporting, 
just the same way as if it was read with a GDAL-driver?



There are possible workarounds if the base GDAL driver recognizes the imagery
but not the georeferencing. In that case, you can try adding .aux.xml files
(might not work in all cases; it really depends on whether the driver falls
back to the PAM mechanism), or wrapping the imagery into a VRT file to add
the georeferencing (will work in all cases at the expense of a small overhead,
generally unnoticeable)


Thank you for useful suggestions, I will look into these features.


Of course, the best for the good of the GDAL community and project would be to
contribute (or contract someone) to improving the existing drivers if they
have defects or need improvements.


I agree this is ideal. Since we must focus on downstream applications 
(geophysical algorithms) we cannot dig too deep into the core driver 
programming. But hopefully it is still manageable, it is anyway good to 
see that Antonio has very similar needs and interests as our group.


Best regards from Knut-Frode



[gdal-dev] Re: GDAL drivers written in Python

2011-04-07 Thread Knut-Frode Dagestad

Antonio,

On 06.04.2011 13:09, Antonio Valentino wrote:

I'm not sure to understand. GDAL reads data from the product. If the
product is not geo-referenced you will get data that are not
geo-referenced.

In this case you could use GDAL tools to interpolate data in a
geographic grid but you need to know geographic info and/or parameters
that are stored in the product itself.

In the case of ENVISAT products (except for stripline ones, #3160) you
should be able to perform the re-projection, e.g. using gdalwarp with
the -tps option (I never tested it personally).


It could be that I do not properly understand the term "georeferenced".
For the example of Envisat ASAR, the image data is read perfectly, and 
GCPs are also read, relating rows and columns to latitude and longitude 
(though not correctly, #3709).
When the datum is given (WGS84), should this product then not be 
georeferenced?


Nevertheless, Envisat is marked as not georeferenced in the table of 
supported raster formats, and reprojecting with gdalwarp also does not 
work for me.



Finally, we would also ideally like to have immediate (pre release)
support for upcoming satellites such as the ESA Sentinel satellites.


This is very interesting.
Can you point me to product specifications of Sentinel data?


I have not looked at this yet, but it seems like the format for 
Sentinel-1 will be GeoTIFF and XML, so it should not be a problem 
at all:

http://www.terrafirma.eu.com/Documents/Workshops/WS%206/ESA%20SAR%20Products%20for%20Interferometry.pdf


Best regards from Knut-Frode



[gdal-dev] Re: GDAL drivers written in Python

2011-04-07 Thread Knut-Frode Dagestad

Even,

Thank you. That clarifies, and motivates use of the MEM format.
I gradually start to understand that GDAL is generally more 
file-oriented than memory-oriented.


Then I have one last question for this thread:
As mentioned in another post, for some formats (e.g. Envisat) both the 
image data and GCPs are read by GDAL.
Why, then, does reprojecting with the API or gdalwarp not work, and why 
is this format (among others) classified as "not georeferenced" in the 
table of supported raster formats?


My theory is that this is because polynomial fit between row/col and 
lon/lat does not work since there are errors (contradictions) in the 
list of GCP's (e.g.: http://trac.osgeo.org/gdal/ticket/3709).

Or is there another explanation?

Best regards from Knut-Frode




On 07.04.2011 13:31, Even Rouault wrote:

Quoting Knut-Frode Dagestad knutfrodesop...@hotmail.com:


Hi Even,

On 06.04.2011 20:21, Even Rouault wrote:

You can certainly write a (non-GDAL) reader in Python, but of course, you
won't be able to benefit from the GDAL API with the dataset returned by

your

reader, or using them in software that use the GDAL API.


But if I use the GDAL API to create a GDAL Dataset and then insert a
matrix of data (read with non-GDAL code), and further add some
georeferencing or tie-points, it seems to me that I then can utilize
GDAL API functionality for e.g. reprojecting, subsetting and exporting,
just the same way as if it was read with a GDAL-driver?


Ok, I didn't understand well what you meant before. You could for example
create a MEM dataset ( http://gdal.org/frmt_mem.html ) and feed it with a
matrix of data and georeferencing. If your data is too big and cannot fit
entirely into RAM, then you'll have to go with a temporary file, such as GeoTIFF.





[gdal-dev] Re: GDAL drivers written in Python

2011-04-06 Thread Knut-Frode Dagestad

Thank you for useful comments, Antonio.

I agree with you that the ideal thing would be to improve the support of 
GDAL for our satellite data formats of interest. It would be great if we 
could combine efforts and get some momentum on this.


Several formats (e.g. TerraSAR-X, Envisat ASAR, MERIS and AATSR) are 
read, but not georeferenced. To my understanding this must be due to 
bugs in the drivers, as listed in some tickets (#3709, #3160).
Since Radarsat-2 is georeferenced, I assume it should also be possible 
to georeference other formats which are not projected, but have only 
many GCPs.
I tried the ALOS/PALSAR driver, but it did not seem to work at all. The 
drivers are also rather minimalistic, and do not, for example, read the 
Doppler Grid of Envisat ASAR. However, from your answer, I assume it is 
straightforward to read such extra data with supplementary Python code. 
I am also not sure how easy it is to get support and geocoding for 
range-projected Single Look Complex data(?)
Finally, we would also ideally like to have immediate (pre-release) 
support for upcoming satellites such as the ESA Sentinel satellites.


We are considering a solution using the ESA BEAM and NEST Java APIs for 
additional readers. But it is getting a bit messy to mix Python, C and 
Java in a multi-platform environment (Linux, Mac and Windows).


Best regards from Knut-Frode



On 06.04.2011 10:13, Antonio Valentino wrote:

Hi Knut-Frode,

On Wed, 06 Apr 2011 09:51:42 +0200,
Knut-Frode Dagestad knutfrodesop...@hotmail.com wrote:


Hi,

Our group is looking at the possibility to use GDAL with Python
bindings for our work with satellite data. One limitation seems
however to be missing/incomplete drivers, as we have no capacity to
implement new drivers in C.


It would be interesting to know which are the missing formats and the
missing features in existing drivers.
I use GDAL Python bindings to read satellite data of SAR missions and
I'm very interested GDAL to have the best possible support for this
kind of data.
So maybe I could help in some way if you provide some more info.


It seems however to me that the Python API allows to create GDAL
Datasets from data which in principle could be read by any non-GDAL
functions, e.g. PyHDF or self-made Python readers.


Yes, correct


My question is therefore: is there in principle any important
differences or limitations with datasets created this way, compared
to datasets read with native GDAL drivers?

Best regards from Knut-Frode


GDAL datasets are an abstraction of the real datasets on file, so once
created, there is no difference between datasets.

IMHO, also depending on the version of GDAL you use, the Python bindings
could miss some functions that are present in the C/C++ API, so creating a
new dataset from scratch with the Python API could be a little harder
in some very particular cases.

regards



