Hi,

we encountered a problem with decimal precision when converting a SAGA grid to ESRI ASCII.

I expect the problem arises from the ASCII-to-double conversion done by atof() and/or from the geotransform adjustment (pixel-as-point to pixel-as-area).

The SAGA header looks like

POSITION_XMIN   = 12181.8000000000
POSITION_YMIN   = 219184.8800000000
CELLSIZE        = 1.0000000000

The final ESRI ASCII header looks like

xllcorner    12181.299999999999
yllcorner    219184.380000000005

instead of
xllcorner    12181.3
yllcorner    219184.38

How is this problem handled in other drivers (using strtod(), stringstream, ...)?

The relevant lines of code in sagadataset.cpp (http://trac.osgeo.org/gdal/browser/trunk/gdal/frmts/saga/sagadataset.cpp) are:

406: dXmin     = atof(papszTokens[1]);
408: dYmin     = atof(papszTokens[1]);
410: dCellsize = atof(papszTokens[1]);

601: padfGeoTransform[0] = poGRB->m_Xmin - poGRB->m_Cellsize / 2;
602: padfGeoTransform[3] = poGRB->m_Ymin + (nRasterYSize - 1) * poGRB->m_Cellsize + poGRB->m_Cellsize / 2;



Thanks,
Volker

_______________________________________________
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev
