Opened 7 years ago
Closed 7 years ago
#211 closed defect (fixed)
NTv2 calculations incorrect with German BWTA2017.gsb file ... Element count results in RowSize greater than unsigned short
| Reported by: | ar532 | Owned by: | Norm Olsen |
|---|---|---|---|
| Priority: | major | Milestone: | |
| Component: | Library | Version: | |
| Keywords: | | Cc: | |
Description
The just-released file BWTA2017.gsb (around 392 megabytes!) contains 4369 elements per row. Since the RowSize member of the csNTv2SubGrid_ struct is an unsigned short, the RowSize calculation overflows the 65536 limit (the actual value would have been 69904, but the result is 4368). The member should be changed to an unsigned long instead, and the line
subPtr->RowSize = (unsigned short)(subPtr->ElementCount * thisPtr->RecSize);
should be changed to reflect this.
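For illustration, a minimal sketch of the suggested fix, using the member and variable names quoted above; the type shown for ElementCount is an assumption, and ulong32_t (the 32-bit unsigned type used elsewhere in this ticket) simply mirrors the "unsigned long" suggestion:

```c
/* Sketch only: widen RowSize so the per-row byte count cannot wrap around.
   Member types other than RowSize are illustrative assumptions.            */
struct csNTv2SubGrid_
{
    /* ... other members unchanged ... */
    ulong32_t ElementCount;   /* grid values per row: 4369 in BWTA2017.gsb     */
    ulong32_t RowSize;        /* was unsigned short; 4369 * 16 = 69904 > 65535 */
    /* ... */
};

/* With the wider member the product is stored intact instead of being
   truncated to 69904 % 65536 = 4368.                                       */
subPtr->RowSize = (ulong32_t)(subPtr->ElementCount * thisPtr->RecSize);
```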
In addition, while debugging I noticed that the line:
malcCnt = sizeof (struct csNTv2SubHdr_) * (ulong32_t)thisPtr->SubCount;
should be
malcCnt = sizeof(struct csNTv2SubGrid_) * (ulong32_t)thisPtr->SubCount;
in CSinitNTv2(). This never caused a memory fault, because csNTv2SubHdr_ is larger than csNTv2SubGrid_, so the allocation is too large rather than too small.
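A short sketch of the corrected allocation, again using the names quoted above; the destination pointer subGrids and the bare malloc() call are hypothetical stand-ins for whatever CSinitNTv2() actually uses:

```c
/* Allocate one directory entry per sub-grid.  Sizing the buffer with the
   header struct merely over-allocated (csNTv2SubHdr_ is the larger of the
   two structs), which is why the mix-up never caused a memory fault.      */
malcCnt  = sizeof (struct csNTv2SubGrid_) * (ulong32_t)thisPtr->SubCount;
subGrids = (struct csNTv2SubGrid_ *)malloc ((size_t)malcCnt);  /* 'subGrids' is hypothetical */
if (subGrids == NULL)
{
    /* report the allocation failure to the caller here */
}
```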
Finally, in CScalcNTv2() I noticed that DeltaLng is added to a latitude at the line:
nwCell [LAT] = seCell [LAT] + cvtPtr->DeltaLng;
My guess is that it should have been
nwCell [LAT] = seCell [LAT] + cvtPtr->DeltaLat;
This probably introduces only a small error, but it is worth fixing.
Change History (4)
comment:1 by , 7 years ago
| Owner: | changed from | to |
|---|---|---|
| Status: | new → assigned | |
comment:2 by , 7 years ago
Oops! I have visited this code recently (one year ago) and my memory was correct. The entire file has been read in since circa 2013. Comments, please: do we want to load the entirety of a 392 megabyte file into memory?
comment:3 by , 7 years ago
Hi Norm,
The file is indeed loaded into memory. It takes about a second on a good computer, and our apps are mostly 64-bit now, so we could live with loading 400 megabytes. However, I have just heard from our German representative that Bavaria is planning an NTv2 file with a 1” density, resulting in a file size of about 3.3 GB. That would be too much for just about anyone. A solution to the file loading is therefore desirable, but it could be as simple as single-block caching. For a large file the performance could be bad, but a user relying on a 3 gigabyte file can hardly expect anything better than poor performance.
Alain
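A minimal sketch of the single-block caching idea mentioned above, for illustration only; none of these names exist in CS-MAP, and a real implementation would need 64-bit file offsets (fseeko/_fseeki64) and fuller error handling for the multi-gigabyte case:

```c
#include <stdio.h>
#include <stdlib.h>

/* Keep exactly one block of the .gsb file resident; re-read on a miss. */
typedef struct
{
    FILE  *strm;        /* open grid shift file                     */
    size_t blockSize;   /* bytes per cached block                   */
    long   blockIdx;    /* index of the resident block, -1 if none  */
    char  *block;       /* the single resident block                */
} GsbBlockCache;

/* Return a pointer to the byte at fileOffset, reading its block from disk
   only when it is not already the resident one. */
static const char *gsbFetch (GsbBlockCache *cp, long fileOffset)
{
    long idx = fileOffset / (long)cp->blockSize;
    if (idx != cp->blockIdx)
    {
        if (fseek (cp->strm, idx * (long)cp->blockSize, SEEK_SET) != 0) return NULL;
        /* The final block of the file may be short; a real implementation
           would track the file size rather than ignore the read count.    */
        if (fread (cp->block, 1, cp->blockSize, cp->strm) == 0) return NULL;
        cp->blockIdx = idx;
    }
    return cp->block + (size_t)(fileOffset - idx * (long)cp->blockSize);
}

int main (int argc, char *argv [])
{
    GsbBlockCache cache = { NULL, 64 * 1024, -1L, NULL };
    if (argc < 2) return 1;
    cache.strm  = fopen (argv [1], "rb");
    cache.block = (char *)malloc (cache.blockSize);
    if (cache.strm == NULL || cache.block == NULL) return 1;

    /* Touch two offsets; the second lies in the same 64 KB block as the
       first, so only the first fetch hits the disk.                     */
    gsbFetch (&cache, 0L);
    gsbFetch (&cache, 1024L);

    free (cache.block);
    fclose (cache.strm);
    return 0;
}
```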
comment:4 by , 7 years ago
| Resolution: | → fixed |
|---|---|
| Status: | assigned → closed |
Corrected as suggested in the original ticket. The submission produced Revision 2806.
Thanks for reporting this problem, and thanks for your debugging efforts. These issues will be corrected, most likely as suggested in the defect report.
Would like an opinion from the reporter. If my memory serves (it's been a decade since I last visited this code), the entire NTv2 file is read into memory and is memory resident during calculations, probably for performance reasons and possibly for thread safety as well. Does the idea of reading a 392 MB data file into memory concern you? What's your take on this?
Please advise.