#6058 closed defect (fixed)
gdal_rasterize doesn't always create the proper sized output
Reported by: | Kyle Shannon | Owned by: | Kyle Shannon |
---|---|---|---|
Priority: | normal | Milestone: | 2.1.0 |
Component: | Utilities | Version: | svn-trunk |
Severity: | normal | Keywords: | rasterize |
Cc: |
Description
When gdal_rasterize creates a dataset, it can miss some rasterization:
```
kyle@kyle-bsu-workstation:~/Desktop/tmp$ cat in.csv
WKT,value
"LINESTRING (0 0, 5 5, 10 0, 10 10)",1
kyle@kyle-bsu-workstation:~/Desktop/tmp$ cat in.vrt
<OGRVRTDataSource>
  <OGRVRTLayer name="in">
    <SrcDataSource>in.csv</SrcDataSource>
    <GeometryType>wkbLineString</GeometryType>
    <GeometryField encoding="wkt" field="WKT"/>
  </OGRVRTLayer>
</OGRVRTDataSource>
kyle@kyle-bsu-workstation:~/Desktop/tmp$ gdal_rasterize -burn 1 -init 0 in.csv -tr 1 1 nobuff.tif -at
0...10...20...30...40...50...60...70...80...90...100 - done.
kyle@kyle-bsu-workstation:~/Desktop/tmp$ gdal_translate -of aaigrid nobuff.tif nobuff.asc
Input file size is 10, 10
0...10...20...30...40...50...60...70...80...90...100 - done.
kyle@kyle-bsu-workstation:~/Desktop/tmp$ cat nobuff.asc
ncols        10
nrows        10
xllcorner    0.000000000000
yllcorner    0.000000000000
cellsize     1.000000000000
 0 0 0 0 0 0 0 0 0 0
 0 0 0 0 0 0 0 0 0 0
 0 0 0 0 0 0 0 0 0 0
 0 0 0 0 0 0 0 0 0 0
 0 0 0 0 0 0 0 0 0 0
 0 0 0 0 1 1 0 0 0 0
 0 0 0 1 1 0 1 0 0 0
 0 0 1 1 0 0 0 1 0 0
 0 1 1 0 0 0 0 0 1 0
 1 1 0 0 0 0 0 0 0 1
```
I would expect the far-right column to be burnt. If the input is POINT, the output raster is created with a small buffer (+cellsize/2 in each dimension). I think this buffer should be applied for all geometry types.
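The geometry envelope here is (0, 0)–(10, 10), so at a 1x1 resolution the vertical segment at x = 10 sits exactly on the right edge of the unbuffered raster and never intersects a cell interior. A minimal sketch of the size arithmetic (plain Python for illustration, not GDAL code; the half-pixel expansion mirrors the existing point-layer special case):

```python
import math

def grid_size(min_v, max_v, res, half_pixel_buffer=False):
    """Number of cells spanned by [min_v, max_v] at resolution `res`,
    optionally expanding the extent by half a cell on each side."""
    if half_pixel_buffer:
        min_v -= res / 2
        max_v += res / 2
    return int(math.ceil((max_v - min_v) / res))

# Envelope of LINESTRING (0 0, 5 5, 10 0, 10 10) at 1x1 resolution:
print(grid_size(0, 10, 1))                          # 10 columns, x = 10 is on the edge
print(grid_size(0, 10, 1, half_pixel_buffer=True))  # 11 columns, x = 10 is interior
```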
The proposed patch:
```diff
Index: apps/gdal_rasterize.cpp
===================================================================
--- apps/gdal_rasterize.cpp	(revision 29611)
+++ apps/gdal_rasterize.cpp	(working copy)
@@ -362,8 +362,7 @@
 /* When rasterizing point layers and that the bounds have       */
 /* not been explicitly set, voluntary increase the extent by    */
 /* a half-pixel size to avoid missing points on the border      */
-    if (wkbFlatten(OGR_L_GetGeomType(hLayer)) == wkbPoint &&
-        !bTargetAlignedPixels && dfXRes != 0 && dfYRes != 0)
+    if (!bTargetAlignedPixels && dfXRes != 0 && dfYRes != 0)
     {
         sLayerEnvelop.MinX -= dfXRes / 2;
         sLayerEnvelop.MaxX += dfXRes / 2;
```
fixes this. It breaks a test by changing some output raster sizes, but that is expected; I have a patch for the autotests too. Looking for some feedback in case I missed something. The output with the fix (all steps the same):
```
kyle@kyle-bsu-workstation:~/Desktop/tmp$ cat nobuff.asc
ncols        11
nrows        11
xllcorner    -0.500000000000
yllcorner    -0.500000000000
cellsize     1.000000000000
 0 0 0 0 0 0 0 0 0 0 1
 0 0 0 0 0 0 0 0 0 0 1
 0 0 0 0 0 0 0 0 0 0 1
 0 0 0 0 0 0 0 0 0 0 1
 0 0 0 0 0 0 0 0 0 0 1
 0 0 0 0 0 1 0 0 0 0 1
 0 0 0 0 1 1 1 0 0 0 1
 0 0 0 1 1 0 0 1 0 0 1
 0 0 1 1 0 0 0 0 1 0 1
 0 1 1 0 0 0 0 0 0 1 1
 1 1 0 0 0 0 0 0 0 0 1
```
Which is what I would expect. I don't think the larger output will affect anyone too much.
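With the fix, the raster origin shifts by half a cell to (-0.5, -0.5), so the segment at x = 10 now maps to a valid column. A quick check of the pixel-index arithmetic (plain Python, illustrative only):

```python
def col_of(x, origin_x, res):
    """Column index of coordinate x in a grid whose left edge is origin_x."""
    return int((x - origin_x) / res)

# Unbuffered grid: left edge at 0.0, 10 columns (valid indices 0..9),
# so x = 10 maps to column 10 and is never burnt.
print(col_of(10, 0.0, 1.0))

# Buffered grid: left edge at -0.5, 11 columns (valid indices 0..10),
# so x = 10 maps to column 10 and lands inside the raster.
print(col_of(10, -0.5, 1.0))
```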
Change History (6)
comment:1 by , 9 years ago
Description: | modified (diff) |
---|
comment:2 by , 9 years ago
comment:3 by , 9 years ago
comment:4 by , 9 years ago
Owner: | changed from | to
---|---|
Status: | new → assigned |
comment:5 by , 9 years ago
Resolution: | → fixed |
---|---|
Status: | assigned → closed |
comment:6 by , 9 years ago
trunk r29618 "Fix test": fixes pyflakes errors (https://travis-ci.org/rouault/gdal_coverage/builds/74421633), and hopefully the mingw/mingw-w64 failures caused by line feeds in the command line and by the use of ReadAsArray(), which requires numpy, which isn't always available.
Looks good to me