Opened 13 years ago

Closed 9 years ago

#4189 closed defect (fixed)

FileGDB emits cryptic error message when attempting to write a value longer than the field size

Reported by: Even Rouault
Owned by: warmerdam
Priority: normal
Milestone:
Component: OGR_SF
Version: unspecified
Severity: normal
Keywords: FileGDB
Cc: pramsey, aestrada

Description

This is a more focused version of #4186.

Basically, if the definition of a string field has no declared width, the driver defaults to a width of 256.

If a feature has a value for that field that is longer than that size, the SDK emits the following error: "Error: Failed at writing Row to Table in CreateFeature. (The row contains a bad value.)", which makes the real cause of the failure difficult to diagnose.
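For reference, a minimal sketch that reproduces the failure with the OGR C++ API (paths and field names are made up; assumes GDAL was built with the FileGDB driver; error checking omitted for brevity):

    #include "ogrsf_frmts.h"
    #include <string>

    int main()
    {
        OGRRegisterAll();
        OGRSFDriver *poDriver = OGRSFDriverRegistrar::GetRegistrar()
                                    ->GetDriverByName("FileGDB");
        OGRDataSource *poDS = poDriver->CreateDataSource("test.gdb", NULL);
        OGRLayer *poLayer = poDS->CreateLayer("test", NULL, wkbPoint, NULL);

        // No SetWidth() call: the width stays 0 ("not set"), so the
        // driver silently falls back to 256.
        OGRFieldDefn oField("description", OFTString);
        poLayer->CreateField(&oField);

        OGRFeature *poFeature =
            OGRFeature::CreateFeature(poLayer->GetLayerDefn());
        // 300 characters: longer than the implicit 256-character width,
        // which triggers the cryptic "The row contains a bad value." error.
        poFeature->SetField("description", std::string(300, 'x').c_str());
        poLayer->CreateFeature(poFeature);

        OGRFeature::DestroyFeature(poFeature);
        OGRDataSource::DestroyDataSource(poDS);
        return 0;
    }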

Should we add a check for that in the driver and emit a clearer error message, or should ESRI do a better job?

Ah, and if we do the check on our side, we must check what the field size actually means for the SDK: is it a number of bytes or a number of UTF-8 characters?

Or could we create a field without a specified field size?
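If we did the check on our side, it could be a small helper along these lines (a hypothetical sketch, not actual driver code; note it assumes the SDK counts bytes, which is exactly the open question above):

    #include "cpl_error.h"
    #include "ogr_feature.h"
    #include <cstring>

    // Hypothetical pre-write validation: compare the value length against
    // the declared field width and report something actionable instead of
    // "The row contains a bad value."
    static OGRErr ValidateStringWidth(OGRFieldDefn *poFieldDefn,
                                      const char *pszValue)
    {
        // strlen() counts bytes, not UTF-8 characters; if the SDK counts
        // characters, this comparison would be too strict.
        const int nLen = static_cast<int>(strlen(pszValue));
        if (poFieldDefn->GetWidth() > 0 && nLen > poFieldDefn->GetWidth())
        {
            CPLError(CE_Failure, CPLE_AppDefined,
                     "Value of field '%s' is %d bytes long, which exceeds "
                     "the declared field width of %d.",
                     poFieldDefn->GetNameRef(), nLen,
                     poFieldDefn->GetWidth());
            return OGRERR_FAILURE;
        }
        return OGRERR_NONE;
    }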

Change History (9)

comment:1 by aestrada, 13 years ago

Hi Even,

I think it would be VERY powerful to have the columns automatically sized based on the source data. This is one of the advantages of using the FGDB to begin with. I will ask ESRI what they intend to do about their error messaging and then post my findings back here.

Thanks, Adam

comment:2 by jpalmer, 12 years ago

Hi Even,

Might it just be easier to set the default string length to 2147483647 (the maximum string length defined by FGDB)? That way, any OGR string field that has no declared length can effectively be of any length.

This would be really useful for me at the moment, as I've been testing with PostgreSQL tables that use the TEXT data type and with WFS feeds whose fields contain far more than 255 characters.

Thanks, Jeremy

comment:3 by Even Rouault, 12 years ago

Jeremy, I've followed your advice. Please test and report whether it works. I have no access to ArcGIS, so I could only test with the FileGDB SDK.

r23328 /trunk/gdal/ogr/ogrsf_frmts/filegdb/FGdbUtils.cpp: FileGDB: increase string field width to 2147483647 when its size is not known (#4189)
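The idea of the change boils down to the following (a paraphrased sketch with illustrative names, not the actual code; the real diff is in r23328):

    #include "ogr_feature.h"

    // When converting an OGR field definition to a FileGDB one, map an
    // unset OGR width (0) to the FGDB maximum string length instead of
    // the previous hard-coded 256.
    static int GetFGDBWidth(OGRFieldDefn *poOGRField)
    {
        return poOGRField->GetWidth() > 0 ? poOGRField->GetWidth()
                                          : 2147483647;
    }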

comment:4 by jpalmer, 12 years ago

Resolution: fixed
Status: new → closed

Wow, thanks, that was so fast. I've updated to trunk and it fixes the problem.

Thanks again.

comment:5 by aestrada, 12 years ago

Resolution: fixed
Status: closed → reopened

All,

I have run across an ugly little problem with this workaround. If I translate a TEXT field from PostgreSQL to FGDB, the 2-billion-character size limit makes ArcGIS crash without warning when performing an attribute query against it. I can certainly cast the field to a VARCHAR or CHAR, but it's hard for me to get the maximum column width for my dynamic data set. Any thoughts on how to deal with this?

I am now on 64-bit Windows using GDAL trunk.

Adam

comment:6 by jpalmer, 12 years ago

I believe that ArcGIS 10.1 fixes the issue with querying and sorting large FGDB text fields.

comment:7 by Even Rouault, 12 years ago

Adam,

Perhaps 65536 might be a more reasonable limit? I think it would cover 99.99% of cases, so if you can test this change successfully, it might be a good default value.
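A possible workaround for your dynamic case, as a rough sketch (names are made up): pre-scan the source layer for the widest value and declare that width explicitly with OGRFieldDefn::SetWidth() before creating the destination field, so the driver never falls back to its default. It costs an extra pass over the data, and the width becomes stale if longer values are appended later.

    #include "ogrsf_frmts.h"
    #include <algorithm>
    #include <cstring>

    // Hypothetical workaround: compute the widest value of a source
    // string field, to be passed to OGRFieldDefn::SetWidth() before
    // CreateField() on the destination layer.
    static int GetMaxFieldLength(OGRLayer *poSrcLayer, int iField)
    {
        int nMaxLen = 1;  // start at 1 to avoid declaring a zero width
        OGRFeature *poFeature;
        poSrcLayer->ResetReading();
        while ((poFeature = poSrcLayer->GetNextFeature()) != NULL)
        {
            if (poFeature->IsFieldSet(iField))
                nMaxLen = std::max(nMaxLen,
                    (int)strlen(poFeature->GetFieldAsString(iField)));
            OGRFeature::DestroyFeature(poFeature);
        }
        return nMaxLen;
    }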

comment:8 by Jukka Rahkonen, 9 years ago

Adam, did you test the 65536 limit?

comment:9 by Even Rouault, 9 years ago

Resolution: fixed
Status: reopened → closed

The default width is now 65536. As we haven't heard vocal protests recently, I guess it works OK. Closing.
