We are storing large raster datasets (~500,000 x 500,000 pixels) in GeoTIFF files.
The array data is typically very sparse (density of 0.02% or less), so
compression greatly reduces the GeoTIFF size.

However, when processing we read the data in chunks, which are
automatically decompressed. This means we are often reading in large areas
of nodata only to skip over them.
Is there a way to read only the cells that have values, so we can apply the
processing to those cells and write them back out, skipping the nodata? I
guess this would be equivalent to simply not decompressing the empty blocks?
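For context, the pattern being asked about is roughly the following: iterate block by block, test whether a block is entirely nodata, and skip it before doing any per-cell work. This is a minimal pure-Python sketch; `read_block`, `process_sparse`, and the `NODATA` sentinel are hypothetical stand-ins for a real GDAL `ReadAsArray`/`WriteArray` loop, and newer GDAL versions additionally expose `GDALRasterBand::GetDataCoverageStatus()`, which can report whether a block holds any data at all:

```python
# Sketch: process a sparse raster block by block, skipping all-nodata blocks.
# `read_block` is a hypothetical stand-in for band.ReadAsArray(xoff, yoff, ...).
NODATA = 0

def read_block(raster, x0, y0, xsize, ysize):
    """Return one block as a list of rows (stand-in for a GDAL chunk read)."""
    return [row[x0:x0 + xsize] for row in raster[y0:y0 + ysize]]

def process_sparse(raster, block=2, func=lambda v: v * 10):
    """Apply `func` only inside blocks that contain at least one data value."""
    height, width = len(raster), len(raster[0])
    out = [[NODATA] * width for _ in range(height)]
    skipped = 0
    for y0 in range(0, height, block):
        for x0 in range(0, width, block):
            data = read_block(raster, x0, y0, block, block)
            if all(v == NODATA for row in data for v in row):
                skipped += 1  # empty block: no per-cell processing needed
                continue
            for dy, row in enumerate(data):
                for dx, v in enumerate(row):
                    out[y0 + dy][x0 + dx] = func(v) if v != NODATA else NODATA
    return out, skipped

raster = [
    [0, 0, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 5, 0],
    [0, 0, 0, 7],
]
result, skipped = process_sparse(raster)
# three of the four 2x2 blocks are empty and are skipped entirely
```

Note this sketch still "reads" the empty blocks to discover they are empty; avoiding the decompression itself requires library support for querying block/tile emptiness before the read.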



_______________________________________________
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev
