Opened 21 years ago
Last modified 21 years ago
#389 closed defect (wontfix)
HFA Error Extracting SkipFactor
Reported by: | warmerdam | Owned by: | warmerdam
---|---|---|---
Priority: | high | Milestone: |
Component: | GDAL_Raster | Version: | unspecified
Severity: | normal | Keywords: |
Cc: | shepherdj@… | |
Description
Hi Frank,

I have only just started to use your HFA interface to access ERDAS Imagine files (instead of ERDAS's C toolkit). One of the programs I use quite a bit is a command-line version of Imagine's imageinfo GUI. This program gets most of the information you are likely to need to know about a raster file and prints it to the terminal (i.e. you don't need an IMAGINE licence to find out simple file info). The output of this program (using your library) looks something like this:

    cosi.img(:cosi)
    Pyramid Layers: Present
    Compression: Run-Length
    Lines: 14293
    Pixels: 15017
    Layer: Continuous
    Data: Float 32-bit
    Min: -0.527537
    Max: 1
    Mean: 0.608896
    Median: 0.618116
    Mode: 0.618116
    Std Dev: 0.103194
    Skip Factor X: 1
    Skip Factor Y: 1
    Upper Left X: 2113902.500
    Upper Left Y: 5568422.500
    Lower Right X: 2339142.500
    Lower Right Y: 5354042.500
    Pixel Size X: 15.000
    Pixel Size Y: 15.000
    Unit: meters
    Projection: New Zealand Map Grid

I managed to use your HFA library to retrieve all the information that I needed, but the statistics skip factors were incorrect if zero was excluded when the statistics were calculated. I checked the calls as far as I could, but a call like

    psStatistics->sfx = poStatisticsParameters->GetIntField( "SkipFactorX" );

gives incorrect information when zero is excluded in the stats. I wondered if this could be a bug in the way the HFA structure was read, so I tried using your program hfatest to do a tree dump (option -dt) of the same raster file with stats calculated including and excluding zero.
Sure enough, the node output of the dump did not match the node data as given in ERDAS's hfaview GUI when zeros were excluded.

Below is the dump of the HistogramParameters node with zero included and stats set at skip factor 1 (correct output):

    HistogramParameters(Eimg_StatisticsParameters830) 70 @ 1402
    + LayerNames = (no values)
    + ExcludedValues = (no values)
    + AOIname =
    + string = (no values)
    + SkipFactorX = 1
    + SkipFactorY = 1
    + BinFunction =
    + numBins = 256
    + binFunctionType = linear
    + minLimit = -0.527537
    + maxLimit = 1.000000
    + binLimits = (no values)

Below is the dump of the HistogramParameters node with zero excluded and stats set at skip factor 1 (incorrect output):

    HistogramParameters(Eimg_StatisticsParameters830) 90 @ 507573279
    + LayerNames = (no values)
    + ExcludedValues = (basedata)
    + AOIname =
    + string = ` '
    + SkipFactorX = 256
    + SkipFactorY = 0
    + BinFunction = (no values)

I don't have the C++ knowledge to track this down and was hoping you could trace the cause of the problem. I tried this test on both Solaris and Linux, so I don't think it is a platform issue.

Regards,
James Shepherd
Change History (7)
comment:2 by , 21 years ago
Actually, I have found another sample file I received just yesterday that does have the problem. I will dig into it further. No sample file required.
comment:3 by , 21 years ago
James,

I have come to the conclusion that the file is corrupt. This node:

    StatisticsParameters(Eimg_StatisticsParameters830) @ 3423 + 60 @ 3551
    + LayerNames = (no values)
    + ExcludedValues = (basedata)
    + AOIname =
    + string = ` '
    + SkipFactorX = 256
    + SkipFactorY = 0
    + BinFunction = (no values)

looks like this in a corrupt file:

    4167: 00000000 E8BEA001 01000000 F0BEA001 ~~~~~~~~~~~~~~~~
    4183: 01000000 01000000 0A000100 00000000 ~~~~~~~~~~~~~~~~
    4199: 00000000 00000000 0CBFA001 1F000000 ~~~~~~~~~~~~~~~~
    4215: 1F000000 01000000 1CBFA001 00010000 ~~~~~~~~~~~~~~~~
    4231: 00000000 00000000 00000000 000000E0 ~~~~~~~~~~~~~~~~

But a similar OK node in another file:

    StatisticsParameters(Eimg_StatisticsParameters830) @ 670 + 40 @ 798
    + LayerNames = (no values)
    + ExcludedValues = (no values)
    + AOIname =
    + string = (no values)
    + SkipFactorX = 2
    + SkipFactorY = 2
    + BinFunction = (no values)

looks like this:

    798: 00000000 B8F00D10 00000000 C0F00D10 ~~~~~~~~~~~~~~~~
    814: 00000000 C8F00D10 02000000 02000000 ~~~~~~~~~~~~~~~~
    830: 00000000 D8F00D10 9E020000 FA040000 ~~~~~~~~~~~~~~~~
    846: D2000000 CE030000 C6030000 04000000 ~~~~~~~~~~~~~~~~
    862: 44657363 72697074 6F725F54 61626C65 Descriptor_Table

The good example has three 8-byte pointer definitions (size and offset) for the LayerNames, ExcludedValues, and AOIname fields, followed by the SkipFactorX and SkipFactorY values. The bad example has only two 8-byte pointer definitions before the skip factors start.

As far as I can tell, generating these files is a bug in Imagine, and they have somehow patched Imagine to fix things up after the fact, rather than relying on their general data-definition logic. I have considered doing some sort of hack, but without a better understanding of how to identify the situation I think it would be more dangerous than doing nothing. So, in short, I would like to not do anything.
You may need to just grab the data offset of the statistics info and try to hack out the information for yourself if you really need to access this information.
comment:4 by , 21 years ago
Hi Frank,

I have just read your notes on Bugzilla. By "corrupt" do you mean that whenever values are excluded in the histogram, IMAGINE corrupts its own format (and then fixes it again if you recalculate including zero)? It seems pretty strange for the ERDAS hfaview tool to work fine in both cases because of a specific hack! I will send you a pair of small, otherwise identical files next week with stats calculated either way to confirm that one type is corrupt.

James
comment:5 by , 21 years ago
James,

I would encourage you to register with Bugzilla and then add yourself to the "cc" list for the bug report. Then I can just append messages to the bug report and you will see a copy. Having email discussions about a bug outside Bugzilla is problematic, since I either have to add all messages manually or lose some of the bug information.

I am not convinced that the node is corrupted based on whether values are excluded. My working assumption has been that this relates to a particular version of Imagine, though I have no strong basis for that. It is theoretically possible that a whole field of the structure can just not be written to the file under some circumstances (such as there being no data for it), but my understanding was that in this situation it would still write a 4-byte "0" and then a 4-byte file offset (which is not important), even for an empty pointer field. If you can provide files produced by the same version of Imagine, one showing this problem and one not, that would be helpful.

As for whether Imagine would have a specific hack, I think it depends on whether Imagine actually uses the data dictionary information _always_ when reading and writing node information. I suspect, but don't know, that this is not always the case. That is, special logic is sometimes provided for some kinds of nodes, and thus if the code gets out of sync with the "data definition" in the file, the file would be corrupt as far as any non-Imagine application that depends on the data definitions is concerned.

Of course, it could also easily be that I am just not interpreting some of the rules of how structures are stored in the file properly. If we can come to that conclusion, then there should be a bug I can fix.

Best regards,
comment:6 by , 21 years ago
Reopening to give James a chance to convince me there is something that is practically fixable here.
comment:7 by , 21 years ago
I have sent Frank a couple of trial images, cosi_a.img and cosi_b.img. They are identical in terms of image content; the only difference is that the 'b' image had its statistics calculated ignoring zero. The IMAGINE version used was 8.4. The 'b' image exhibits the statistics skip factor HFA reading problem discussed earlier, while the 'a' image works fine.