#3938 closed defect (fixed)
Bug when using -co COPY_SRC_OVERVIEWS=YES on a multiband source with external overviews
Reported by: | Even Rouault | Owned by: | Even Rouault
---|---|---|---
Priority: | normal | Milestone: | 1.8.1
Component: | default | Version: | 1.8.0
Severity: | normal | Keywords: |
Cc: | | |
Description
Originally reported at http://lists.osgeo.org/pipermail/gdal-dev/2011-January/027583.html
Can be triggered by:

    gdal_translate ../autotest/gcore/data/rgbsmall.tif src.tif -outsize 260 260
    gdaladdo -ro src.tif 2
    gdal_translate src.tif dst.tif -co COPY_SRC_OVERVIEWS=YES
This leads to an overview with invalid data whenever nBlockYOff > 0.
Here's the sequence of what goes wrong:
- the GDALOverviewDS proxy object wrongly declared that the overview dataset was pixel interleaved (but external .tif.ovr files are band interleaved), because it queried the base dataset instead of the overview dataset. This is not ideal, but per se should not lead to wrong results
- but this caused BlockBasedRasterIO() to be used
- which caused the GDALRasterBand::IRasterIO() default implementation to be called on the proxy band
- which resulted in IReadBlock() being called on the underlying GTiffRasterBand
--> but at no point has the nBlocksPerRow member of this GTiffRasterBand been initialized, since InitBlockInfo() was never called. So when computing nBlockIdBand0 = nBlockXOff + nBlockYOff * nBlocksPerRow, only the blocks of the top row can ever be fetched!
There are several possible fixes, all of which I've tested successfully:
- change BlockBasedRasterIO() to call poBand->IRasterIO() instead of poBand->GDALRasterBand::IRasterIO(). That way the GDALRasterBand::IRasterIO() implementation is called on the underlying band and not on the proxy band, and InitBlockInfo() will be done on that underlying band. But this may not be a desirable solution for drivers that redefine IRasterIO() to not be based on the block cache.
- call InitBlockInfo() in GTiffRasterBand::IReadBlock(). The core issue could also affect drivers other than the GTiff driver, so this is not a bulletproof solution
- make sure that GDALProxyRasterBand::IReadBlock() calls InitBlockInfo() on the underlying band before calling IReadBlock() on that band
I've chosen the last one. But one must be aware that there may be other drivers that do things similar to what GDALProxyRasterBand::IReadBlock() does...
Change History (2)
comment:1 by , 13 years ago
Resolution: | → fixed |
---|---|
Status: | new → closed |
comment:2 by , 13 years ago
r21866 /trunk/gdal/gcore/gdalproxydataset.cpp: GDALProxyRasterBand::IRasterIO() : don't explicitely call poSrcBand->InitBlockInfo() (partial revert of r21612, #3938)
r21867 /branches/1.8/gdal/gcore/gdalproxydataset.cpp: GDALProxyRasterBand::IRasterIO() : don't explicitely call poSrcBand->InitBlockInfo() (partial revert of r21612, #3938)
--> necessary to avoid a useless and memory-expensive InitBlockInfo() on the full-resolution band when opening a VRT with a huge WMS dataset in QGIS.
Fixed in trunk (r21612) and branches/1.8 (r21613). Test added in r21614.
(the root causes are also present in the 1.7 branch, but I'm not sure the right combination of circumstances exists in 1.7 to trigger them)