Opened 13 years ago
Closed 9 years ago
#3982 closed defect (fixed)
[PATCH] In .dbf files, large values in numeric fields with zero precision are corrupted.
Reported by: chaitanya | Owned by: chaitanya
Priority: normal | Milestone: 1.8.1
Component: OGR_SF | Version: unspecified
Severity: normal | Keywords: shapefile, dbf
Cc: warmerdam
Description
Bug found by Anita Graser. http://lists.osgeo.org/pipermail/gdal-dev/2011-February/027892.html
dbfopen.c is treating all real values with zero precision as integers. Values longer than ten digits are lost this way.
Attachments (2)
Change History (7)
comment:1 by , 13 years ago
Cc: added
comment:3 by , 13 years ago
Summary: In .dbf files, large values in numeric fields with zero precision are corrupted. → [PATCH] In .dbf files, large values in numeric fields with zero precision are corrupted.
Frank,
Attached is a patch to dbfopen.c that fixes the issue, along with an autotest that checks it.
In fact, I'm wondering whether the "if( psDBF->panFieldDecimals[iField] == 0 && .... )" branch has any advantage over using the else part in all cases.
by , 13 years ago
Attachment: dbfopen_patch_for_3982.patch added
by , 13 years ago
Attachment: test_for_3982.patch added
This is a bit surprising. A field with more than 10 characters should already be handled as a double according to DBFGetFieldInfo().
There is a real precision limit, but it is elsewhere: an "integer" with more than about 15 significant digits (beyond 2^53) cannot be represented exactly in a double (so Int64 support might help a bit), but that doesn't seem to be the issue reported by the user here.