Opened 13 years ago

Closed 11 years ago

Last modified 11 years ago

#4160 closed enhancement (fixed)

add support for georeference and geotransform info from COSMO-SKYMED files to HDF5 driver

Reported by: alexmantaut Owned by: alexmantaut
Priority: normal Milestone: 1.10.1
Component: GDAL_Raster Version: unspecified
Severity: normal Keywords: hdf5 COSMO SKYMED georeference
Cc: warmerdam, antonio, Markus Neteler

Description

Hi all,

I've created a patch which allows GDAL to interpret the georeference and geotransform info from COSMO-SKYMED files.

To know more about the COSMO SkyMed products:

COSMO SkyMed handbook: http://www.e-geos.it/products/pdf/csk-product%20handbook.pdf

This patch depends on solving ticket:4121; I posted a patch on that ticket that solves the issue.

Any feedback is welcome; I hope this patch is useful to the community.

Regards, Alex

Attachments (10)

csk_hdf5.patch (26.5 KB ) - added by alexmantaut 13 years ago.
Patch to get georeference and geotransform info (1st version)
hdf5_csk_111011.patch (34.7 KB ) - added by alexmantaut 13 years ago.
hdf5_csk_131011.patch (223.1 KB ) - added by alexmantaut 13 years ago.
hdf5_csk_trunk_r23238.patch (34.7 KB ) - added by antonio 13 years ago.
The same patch submitted by Alex but ported to trunk
CSKTest.rar (3.6 KB ) - added by alexmantaut 13 years ago.
hdf5_csk_test_trunk_r23269.patch (7.9 KB ) - added by antonio 13 years ago.
Test code for CSK specific functions
CSK_DGM.h5 (11.5 KB ) - added by antonio 13 years ago.
CSK_GEC.h5 (12.3 KB ) - added by antonio 13 years ago.
hdf5_csk_trunk_r23238.2.patch (34.7 KB ) - added by alexmantaut 13 years ago.
Small fix to the trunk patch
hdf5_csk_trunk_r23238.3.patch (30.6 KB ) - added by alexmantaut 12 years ago.
New version of the patch; spatial reference and geotransform issues solved.

Download all attachments as: .zip

Change History (44)

by alexmantaut, 13 years ago

Attachment: csk_hdf5.patch added

Patch to get georeference and geotransform info (1st version)

comment:1 by Even Rouault, 13 years ago

Summary: COSMO-SKYMED Georeference support → [PATCH] COSMO-SKYMED Georeference support

comment:2 by warmerdam, 13 years ago

Cc: antonio added

comment:3 by antonio, 13 years ago

Alex, a couple of notes on your patch:

  • you say that the patch depends on attachment:hdf5dataset-metadata.patch:ticket:4121 but, if I remember correctly, all metadata related to an image should be associated with the "HDF5 dataset" itself or with its ancestors. Can you please confirm?
  • you use the GDAL API to access metadata, but IMHO it would be better to use the HDF5 API. That way you should be able to avoid back-and-forth conversion to strings and also avoid string parsing (you can get vector attributes directly)
  • if you use the HDF5 API for metadata access you should be able to retrieve metadata from the entire hierarchy (decoupling from #4121 if it ever becomes a problem)
  • when you set the geotransform you should check the projection ID instead of relying on the "product type". That way you should also be able to handle higher-level products (co-registered, interferometric, DEM, etc.)
  • if the product is not in UTM/UPS projection we could probably load the GCP info, which is always present at any processing level (including RAW)

comment:4 by alexmantaut, 13 years ago

Status: new → assigned

Antonio:

  • In order to build the geotransform matrix, I should get the image's dimensions and the coordinates of its corners (i.e. Top Left Geodetic Coordinates, etc.), which are associated with each image in particular.
  • I used the GDAL API to get the metadata to avoid having to deal with the HDF5 API directly (which IMHO would be duplicating code). But of course using the HDF5 API will have less overhead... You're right that using the HDF5 API will decouple this patch from #4121, so I will change it. The thing is: I'm not sure exactly what operations I need to perform in order to retrieve and format the metadata via the HDF5 API; can you provide an example?
  • When you say check the projection Id instead of the product type, you mean change:
    if(iProductType == PROD_CSK_L1C||iProductType == PROD_CSK_L1D) 
    {
    //...build geotransform
    }
    to something like:
    if(projectionId == UTM || projectionId == UPS ) 
    {
    //...build geotransform
    }
    

I agree that it is a better option than checking the product level, so I will change it.

  • On the issue with the GCPs, I didn't take that into consideration... To be honest, I'm not sure how to load the GCPs from the file...

in reply to:  4 ; comment:5 by antonio, 13 years ago

Replying to alexmantaut:

Antonio:

  • In order to build the geotransform matrix, I should get the image's dimensions and the coordinates of its corners (i.e. Top Left Geodetic Coordinates, etc.), which are associated with each image in particular.

IMHO the geotransform should be set only if the projection is UTM or UPS. In that case you just need the "Top Left East-North" and the line/column spacing.

  • I used the GDAL API to get the metadata to avoid having to deal with the HDF5 API directly (which IMHO would be duplicating code). But of course using the HDF5 API will have less overhead... You're right that using the HDF5 API will decouple this patch from #4121, so I will change it. The thing is: I'm not sure exactly what operations I need to perform in order to retrieve and format the metadata via the HDF5 API; can you provide an example?

An example of code for reading HDF5 attributes can be found at

source:trunk/gdal/frmts/hdf5/hdf5dataset.cpp@22683#L667

  • When you say check the projection Id instead of the product type, you mean change:
    if(iProductType == PROD_CSK_L1C||iProductType == PROD_CSK_L1D) 
    {
    //...build geotransform
    }
    to something like:
    if(projectionId == UTM || projectionId == UPS ) 
    {
    //...build geotransform
    }
    

yes, exactly

I agree that it is a better option than checking the product level, so I will change it.

  • On the issue with the GCPs, I didn't take that into consideration... To be honest, I'm not sure how to load the GCPs from the file...

All CSK products have at least

  • "Bottom Left Geodetic Coordinates"
  • "Top Left Geodetic Coordinates"
  • "Bottom Right Geodetic Coordinates"
  • "Top Right Geodetic Coordinates"
  • and also "Scene Centre Geodetic Coordinates"

attached to the image (SBI/MBI)

GCPs IMHO should be used in all cases in which projection is not UTM or UPS.

in reply to:  5 ; comment:6 by alexmantaut, 13 years ago

Replying to antonio:

Replying to alexmantaut:

Antonio:

  • In order to build the geotransform matrix, I should get the image's dimensions and the coordinates of its corners (i.e. Top Left Geodetic Coordinates, etc.), which are associated with each image in particular.

IMHO the geotransform should be set only if the projection is UTM or UPS. In that case you just need the "Top Left East-North" and the line/column spacing.

Yes, the geotransform should be set only if the projection is UTM or UPS. In order to build the geotransform I used:

  • Top Left East-North (from the image subdataset)
  • Bottom Right East-North (from the image subdataset)
  • Image Length and width, in order to get the Bottom right image coordinates (from the image subdataset)

Is there any difference between using those parameters and using "Top Left East-North" and line/column spacing? I think it should be the same...

  • I used the GDAL API to get the metadata to avoid having to deal with the HDF5 API directly (which IMHO would be duplicating code). But of course using the HDF5 API will have less overhead... You're right that using the HDF5 API will decouple this patch from #4121, so I will change it. The thing is: I'm not sure exactly what operations I need to perform in order to retrieve and format the metadata via the HDF5 API; can you provide an example?

An example of code for reading HDF5 attributes can be found at

source:trunk/gdal/frmts/hdf5/hdf5dataset.cpp@22683#L667

Ok, I will check that. If possible I would like to build a method that fetches an HDF5 field with a given attribute name, in order to use it in the rest of the code...

  • When you say check the projection Id instead of the product type, you mean change:
    if(iProductType == PROD_CSK_L1C||iProductType == PROD_CSK_L1D) 
    {
    //...build geotransform
    }
    to something like:
    if(projectionId == UTM || projectionId == UPS ) 
    {
    //...build geotransform
    }
    

yes, exactly

I agree that it is a better option than checking the product level, so I will change it.

  • On the issue with the GCPs, I didn't take that into consideration... To be honest, I'm not sure how to load the GCPs from the file...

All CSK products have at least

  • "Bottom Left Geodetic Coordinates"
  • "Top Left Geodetic Coordinates"
  • "Bottom Right Geodetic Coordinates"
  • "Top Right Geodetic Coordinates"
  • and also "Scene Centre Geodetic Coordinates"

attached to the image (SBI/MBI)

GCPs IMHO should be used in all cases in which projection is not UTM or UPS.

As for the GCPs, I didn't realize that the image's corners were GCPs :P...

But as explained above I used them in order to build the geotransform matrix, so it shouldn't be a problem...

From what I've read in the CSK handbook, the only projected types of image will be UTM or UPS (for L1C and L1D products). In the case of L1B, L1A and L0 products I think it isn't possible to build a geotransform matrix, because those images aren't reprojected (I'm not sure it is possible to build a geotransform matrix for those images, because there is a big difference in scale in different parts of the image)...

in reply to:  6 ; comment:7 by antonio, 13 years ago

Replying to alexmantaut:

Replying to antonio:

Replying to alexmantaut:

Antonio:

[CUT]

IMHO the geotransform should be set only if the projection is UTM or UPS. In that case you just need the "Top Left East-North" and the line/column spacing.

Yes, the geotransform should be set only if the projection is UTM or UPS. In order to build the geotransform I used:

  • Top Left East-North (from the image subdataset)
  • Bottom Right East-North (from the image subdataset)
  • Image Length and width, in order to get the Bottom right image coordinates (from the image subdataset)

Is there any difference between using those parameters and using "Top Left East-North" and line/column spacing? I think it should be the same...

Yes, it should be the same; it's just that using the line/column spacing you need no computation.

[CUT]

An example of code for reading HDF5 attributes can be found at

source:trunk/gdal/frmts/hdf5/hdf5dataset.cpp@22683#L667

Ok, I will check that. If possible I would like to build a method that fetches an HDF5 field with a given attribute name, in order to use it in the rest of the code...

Good idea. Probably you will need only one function/method for getting attributes with floating point data type.

[CUT]

  • On the issue with the GCPs, I didn't take that into consideration... To be honest, I'm not sure how to load the GCPs from the file...

All CSK products have at least

  • "Bottom Left Geodetic Coordinates"
  • "Top Left Geodetic Coordinates"
  • "Bottom Right Geodetic Coordinates"
  • "Top Right Geodetic Coordinates"
  • and also "Scene Centre Geodetic Coordinates"

attached to the image (SBI/MBI)

GCPs IMHO should be used in all cases in which projection is not UTM or UPS.

As for the GCPs, I didn't realize that the image's corners were GCPs :P...

But as explained above I used them in order to build the geotransform matrix, so it shouldn't be a problem...

Please note that "Top Left East-North" is not the same as "Top Left Geodetic Coordinates". The latter is lat/long coordinates (WGS84).

From what I've read in the CSK handbook, the only projected types of image will be UTM or UPS (for L1C and L1D products). In the case of L1B, L1A and L0 products I think it isn't possible to build a geotransform matrix, because those images aren't reprojected (I'm not sure it is possible to build a geotransform matrix for those images, because there is a big difference in scale in different parts of the image)...

For L1B products (ground projection) the geotransform that one can compute from the corners should not be too bad (at least for HIMAGE and ENHANCED SPOTLIGHT products).

For products that are in slant-range projection (L0 and L1A), computing the geotransform from GCPs does not make much sense IMHO.

All in all I think that only GCPs should be provided for L1B, L1A and L0 products. The user can always compute a coarse geotransform themselves using e.g. GDALGCPsToGeoTransform if needed.

in reply to:  7 comment:8 by alexmantaut, 13 years ago

Replying to antonio:

Replying to alexmantaut:

Replying to antonio:

Replying to alexmantaut:

Antonio:

[CUT]

IMHO the geotransform should be set only if the projection is UTM or UPS. In that case you just need the "Top Left East-North" and the line/column spacing.

Yes, the geotransform should be set only if the projection is UTM or UPS. In order to build the geotransform I used:

  • Top Left East-North (from the image subdataset)
  • Bottom Right East-North (from the image subdataset)
  • Image Length and width, in order to get the Bottom right image coordinates (from the image subdataset)

Is there any difference between using those parameters and using "Top Left East-North" and line/column spacing? I think it should be the same...

Yes, it should be the same; it's just that using the line/column spacing you need no computation.

[CUT]

An example of code for reading HDF5 attributes can be found at

source:trunk/gdal/frmts/hdf5/hdf5dataset.cpp@22683#L667

Ok, I will check that. If possible I would like to build a method that fetches an HDF5 field with a given attribute name, in order to use it in the rest of the code...

Good idea. Probably you will need only one function/method for getting attributes with floating point data type.

Mmm, I began to implement that method and I realized that the metadata items may be arrays, and also that it would be a problem trying to retrieve metadata in a type different from float (without repeating code). Anyway, to make the method more useful I thought of this:

    MetadataItemContainer *GetBinaryMetadataItem(char *pszItemName);

    class MetadataItemContainer
    {
    private:
        int nItems;     // Number of items on this attribute
        char *pBinary;  // Binary stream (not real chars)
        int dataType;   // Number that represents the type of item

    public:
        int GetNumberItems();     // returns the number of elements in the array
        int GetDataType();        // returns a number with the type of item (float, int, string)
        float toFloat(int nItem); // converts the nth element of the item to float
        int toInt(int nItem);     // converts the nth element of the item to int
        // ...
    };

I think this way of getting the metadata would avoid string conversions (it would only work with binary data), would be general enough to be used with any variable type, and the overhead associated with fetching the item would be small... What do you think? May I implement it this way?

[CUT]

  • On the issue with the GCPs, I didn't take that into consideration... To be honest, I'm not sure how to load the GCPs from the file...

All CSK products have at least

  • "Bottom Left Geodetic Coordinates"
  • "Top Left Geodetic Coordinates"
  • "Bottom Right Geodetic Coordinates"
  • "Top Right Geodetic Coordinates"
  • and also "Scene Centre Geodetic Coordinates"

attached to the image (SBI/MBI)

GCPs IMHO should be used in all cases in which projection is not UTM or UPS.

As for the GCPs, I didn't realize that the image's corners were GCPs :P...

But as explained above I used them in order to build the geotransform matrix, so it shouldn't be a problem...

Please note that "Top Left East-North" is not the same as "Top Left Geodetic Coordinates". The latter is lat/long coordinates (WGS84).

All right, I hadn't noticed it, but I was confusing "Top Left East-North" with "Top Left Geodetic Coordinates" :P. So in order to get the GCPs I should use the Geodetic Coordinates, right?

From what I've read in the CSK handbook, the only projected types of image will be UTM or UPS (for L1C and L1D products). In the case of L1B, L1A and L0 products I think it isn't possible to build a geotransform matrix, because those images aren't reprojected (I'm not sure it is possible to build a geotransform matrix for those images, because there is a big difference in scale in different parts of the image)...

For L1B products (ground projection) the geotransform that one can compute from the corners should not be too bad (at least for HIMAGE and ENHANCED SPOTLIGHT products).

For products that are in slant-range projection (L0 and L1A), computing the geotransform from GCPs does not make much sense IMHO.

All in all I think that only GCPs should be provided for L1B, L1A and L0 products. The user can always compute a coarse geotransform themselves using e.g. GDALGCPsToGeoTransform if needed.

Mmm, I'm not sure I understood the last part properly...

The conclusion would be:

  • I should provide the GCPs for all the products (although they will not be necessary for L1C and L1D products)?
  • I will not provide a geotransform for L1B, L1A and L0, although the user will be able to create their own transformation using GDALGCPsToGeoTransform? Or should I provide a geotransform for HIMAGE and ENHANCED SPOTLIGHT products?

comment:9 by antonio, 13 years ago

Replying to alexmantaut:

Replying to antonio:

Replying to alexmantaut:

[CUT]

Ok, I will check that. If possible I would like to build a method that fetches an HDF5 field with a given attribute name, in order to use it in the rest of the code...

Good idea. Probably you will need only one function/method for getting attributes with floating point data type.

Mmm, I began to implement that method and I realized that the metadata items may be arrays, and also that it would be a problem trying to retrieve metadata in a type different from float (without repeating code). Anyway, to make the method more useful I thought of this:

    MetadataItemContainer *GetBinaryMetadataItem(char *pszItemName);

    class MetadataItemContainer
    {
    private:
        int nItems;     // Number of items on this attribute
        char *pBinary;  // Binary stream (not real chars)
        int dataType;   // Number that represents the type of item

    public:
        int GetNumberItems();     // returns the number of elements in the array
        int GetDataType();        // returns a number with the type of item (float, int, string)
        float toFloat(int nItem); // converts the nth element of the item to float
        int toInt(int nItem);     // converts the nth element of the item to int
        // ...
    };

I think this way of getting the metadata would avoid string conversions (it would only work with binary data), would be general enough to be used with any variable type, and the overhead associated with fetching the item would be small... What do you think? May I implement it this way?

Alex, I think your solution is very elegant, but probably something simpler is more appropriate in this case. Also consider that you only need to read attributes with float data type (please confirm).

A solution like the following is probably sufficient in this case:

    herr_t HDF5ReadDoubleAttr(hid_t hH5ObjID, const char* szAttrName, int nLen, double *pdfValues)

Read at most nLen elements of the attribute named szAttrName with double data type and store them in the preallocated array pdfValues. In case of errors the return value is set accordingly.

This is by far less general than your solution, but IMHO it is also by far simpler and more appropriate to the context.

In case we actually need a more general solution, IMHO we should move toward something more similar to the HDF5 C++ API or, in the extreme case, use the HDF5 C++ API itself.

GCPs IMHO should be used in all cases in which projection is not UTM or UPS.

As for the GCPs, I didn't realize that the image's corners were GCPs :P...

But as explained above I used them in order to build the geotransform matrix, so it shouldn't be a problem...

Please note that "Top Left East-North" is not the same as "Top Left Geodetic Coordinates". The latter is lat/long coordinates (WGS84).

All right, I hadn't noticed it, but I was confusing "Top Left East-North" with "Top Left Geodetic Coordinates" :P. So in order to get the GCPs I should use the Geodetic Coordinates, right?

Yes

[CUT]

All in all I think that only GCPs should be provided for L1B, L1A and L0 products. The user can always compute a coarse geotransform themselves using e.g. GDALGCPsToGeoTransform if needed.

Mmm, I'm not sure I understood the last part properly...

The conclusion would be:

  • I should provide the GCPs for all the products (although they will not be necessary for L1C and L1D products)?
  • I will not provide a geotransform for L1B, L1A and L0, although the user will be able to create their own transformation using GDALGCPsToGeoTransform? Or should I provide a geotransform for HIMAGE and ENHANCED SPOTLIGHT products?

IMHO

  • provide a geotransform only if the product is in UTM or UPS projection (e.g. L1C and L1D products). No GCPs.
  • provide only GCPs in all other cases (including L0, L1A and L1B). No geotransform.

In the second case the user can always compute an approximate geotransform themselves using the GCPs.

Do you think it is an acceptable solution?

in reply to:  9 ; comment:10 by alexmantaut, 13 years ago

Replying to antonio:

Replying to alexmantaut:

Replying to antonio:

Replying to alexmantaut:

[CUT]

Ok, I will check that. If possible I would like to build a method that fetches an HDF5 field with a given attribute name, in order to use it in the rest of the code...

Good idea. Probably you will need only one function/method for getting attributes with floating point data type.

Mmm, I began to implement that method and I realized that the metadata items may be arrays, and also that it would be a problem trying to retrieve metadata in a type different from float (without repeating code). Anyway, to make the method more useful I thought of this:

    MetadataItemContainer *GetBinaryMetadataItem(char *pszItemName);

    class MetadataItemContainer
    {
    private:
        int nItems;     // Number of items on this attribute
        char *pBinary;  // Binary stream (not real chars)
        int dataType;   // Number that represents the type of item

    public:
        int GetNumberItems();     // returns the number of elements in the array
        int GetDataType();        // returns a number with the type of item (float, int, string)
        float toFloat(int nItem); // converts the nth element of the item to float
        int toInt(int nItem);     // converts the nth element of the item to int
        // ...
    };

I think this way of getting the metadata would avoid string conversions (it would only work with binary data), would be general enough to be used with any variable type, and the overhead associated with fetching the item would be small... What do you think? May I implement it this way?

Alex, I think your solution is very elegant, but probably something simpler is more appropriate in this case. Also consider that you only need to read attributes with float data type (please confirm).

Yes, almost all the attributes I must read are float; the exception is the projection type, which is a string, but it is associated with the root of the dataset, so it should not be a problem.

A solution like the following is probably sufficient in this case:

    herr_t HDF5ReadDoubleAttr(hid_t hH5ObjID, const char* szAttrName, int nLen, double *pdfValues)

Read at most nLen elements of the attribute named szAttrName with double data type and store them in the preallocated array pdfValues. In case of errors the return value is set accordingly.

This is by far less general than your solution, but IMHO it is also by far simpler and more appropriate to the context.

All right, I will implement something similar to the prototype you mentioned.

In case we actually need a more general solution, IMHO we should move toward something more similar to the HDF5 C++ API or, in the extreme case, use the HDF5 C++ API itself.

GCPs IMHO should be used in all cases in which projection is not UTM or UPS.

As for the GCPs, I didn't realize that the image's corners were GCPs :P...

But as explained above I used them in order to build the geotransform matrix, so it shouldn't be a problem...

Please note that "Top Left East-North" is not the same as "Top Left Geodetic Coordinates". The latter is lat/long coordinates (WGS84).

All right, I hadn't noticed it, but I was confusing "Top Left East-North" with "Top Left Geodetic Coordinates" :P. So in order to get the GCPs I should use the Geodetic Coordinates, right?

Yes

[CUT]

All in all I think that only GCPs should be provided for L1B, L1A and L0 products. The user can always compute a coarse geotransform themselves using e.g. GDALGCPsToGeoTransform if needed.

Mmm, I'm not sure I understood the last part properly...

The conclusion would be:

  • I should provide the GCPs for all the products (although they will not be necessary for L1C and L1D products)?
  • I will not provide a geotransform for L1B, L1A and L0, although the user will be able to create their own transformation using GDALGCPsToGeoTransform? Or should I provide a geotransform for HIMAGE and ENHANCED SPOTLIGHT products?

IMHO

  • provide a geotransform only if the product is in UTM or UPS projection (e.g. L1C and L1D products). No GCPs.
  • provide only GCPs in all other cases (including L0, L1A and L1B). No geotransform.

In the second case the user can always compute an approximate geotransform themselves using the GCPs.

Do you think it is an acceptable solution?

Mmm, I totally agree with you on the matter of the geotransform, but I don't see why I should not provide the GCPs for L1C and L1D; in that case the user may just ignore the GCPs... But it is also acceptable not to present the GCPs in that case...

in reply to:  10 comment:11 by antonio, 13 years ago

Replying to alexmantaut:

Replying to antonio:

[CUT]

  • provide a geotransform only if the product is in UTM or UPS projection (e.g. L1C and L1D products). No GCPs.
  • provide only GCPs in all other cases (including L0, L1A and L1B). No geotransform.

In the second case the user can always compute an approximate geotransform themselves using the GCPs.

Do you think it is an acceptable solution?

Mmm, I totally agree with you on the matter of the geotransform, but I don't see why I should not provide the GCPs for L1C and L1D; in that case the user may just ignore the GCPs... But it is also acceptable not to present the GCPs in that case...

From http://www.gdal.org/gdal_datamodel.html

Normally a dataset will contain either an affine geotransform, GCPs or neither.
It is uncommon to have both, and it is undefined which is authoritative.

Honestly I don't know whether all tools using GDAL will work as expected if we provide both a geotransform and GCPs.

comment:12 by Even Rouault, 13 years ago

I concur with Antonio's point and with what's said in the data model. For example, I'm not sure whether gdalwarp would use the geotransform or the GCPs. That could be determined by code analysis of course, but that might be fragile if the code is later changed. Plus, if you use gdal_translate, it is likely that you will get either the GCPs or the geotransform depending on how the output driver is written.

in reply to:  12 comment:13 by alexmantaut, 13 years ago

I didn't know that; in that case I will provide GCPs just for L1B, L1A and L0 products.

by alexmantaut, 13 years ago

Attachment: hdf5_csk_111011.patch added

comment:14 by alexmantaut, 13 years ago

Antonio, sorry for the delay.

I just attached a new version of the patch to the ticket, with all the discussed modifications.

The only modification from what was discussed earlier is the HDF5ReadDoubleAttr() interface. The current interface is:

 CPLErr HDF5ReadDoubleAttr(const char* pszAttrName,double **pdfValues,int *nLen=NULL);

The method doesn't know how much memory to allocate for the array prior to the read, so I made the method get the array's size, allocate the memory internally, and return the array's size through the nLen parameter (if supplied to the method).

Also I added some comments to existing methods, to make reading easier... Feel free to delete them if they're not appropriate or are superfluous.

Regards, Alex

in reply to:  14 comment:15 by antonio, 13 years ago

Replying to alexmantaut:

Hi Alex, thanks for the new patch. I'm having some trouble trying to apply it to the current trunk. Can you please tell us which revision your patch is based on?

comment:16 by alexmantaut, 13 years ago

Hi Antonio, sorry, I just realized that I had some local modifications in the sources used to create the patch... I'm sending a new patch; this one was created against the code on the 1.8 branch.

by alexmantaut, 13 years ago

Attachment: hdf5_csk_131011.patch added

comment:17 by antonio, 13 years ago

Thanks Alex. I have just uploaded a new version of the patch submitted by Alex, modified to apply to the current trunk (r23238), just in case some of the developers want to review it.

I have also verified that it builds (some warnings still to be fixed) and passes all tests for hdf5 that are currently in autotest.

I'm going to write some tests specific to the new features.

by antonio, 13 years ago

Attachment: hdf5_csk_trunk_r23238.patch added

The same patch submitted by Alex but ported to trunk

in reply to:  17 comment:18 by alexmantaut, 13 years ago

I'm going to write some tests specific to the new features.

Hi Antonio,

As you're writing specific tests, I'm sending you the unit tests I used while developing the patch... They're implemented with CppUnit, and I linked them against a CppUnit static library... I don't know if they'll be very helpful for the new tests, but feel free to take any code that is useful to you... Also, all the image paths are hard-coded; you will need to replace them with paths to your own images...

Best regards

by alexmantaut, 13 years ago

Attachment: CSKTest.rar added

by antonio, 13 years ago

Test code for CSK specific functions

by antonio, 13 years ago

Attachment: CSK_DGM.h5 added

by antonio, 13 years ago

Attachment: CSK_GEC.h5 added

comment:19 by antonio, 13 years ago

Hi Alex, I still haven't looked into your test code. Please find attached the test code and a small test dataset for CSK.

As you can see below there are two failures.

$ ./hdf5.py
  TEST: hdf5_1 ... success
  TEST: hdf5_2 ... success
  TEST: hdf5_3 ... success
  TEST: hdf5_4 ... success
  TEST: hdf5_5 ... success
  TEST: hdf5_6 ... success
  TEST: hdf5_7 ... success
  TEST: hdf5_8 ... success
  TEST: hdf5_9 ... success
  TEST: hdf5_10 ... success
  TEST: hdf5_11 ... fail
    line 362: incorrect geo-transform: expected (275592.5, 2.5, 0, 4998152.5, 0, -2.5), found (275592.5, 4576.75, 0.0, 4998152.5, 0.0, -2089.375)
  TEST: hdf5_12 ... success
  TEST: hdf5_13 ... success
  TEST: hdf5_14 ... success
  TEST: hdf5_15 ... success
  TEST: hdf5_16 ... fail
    line 497: incorrect spatial reference detected for COSMO SkyMed GEC_B product: "None"
  TEST: : HDF5:"C1979091.h5"://HDF4_PALGROUP/HDF4_PALETTE_2 ... success
  TEST: : HDF5:"C1979091.h5"://Raster_Image_#0 ... success
  TEST: : HDF5:"half_moon_bay.grid"://HDFEOS/GRIDS/DEMGRID/Data_Fields/Elevation ... success

Test Script: hdf5
Succeeded: 17
Failed:    2 (0 blew exceptions)
Skipped:   0
Expected fail:0

Test hdf5_11 fails on the test dataset but could work on a real one (the failure stems from the fact that the test code uses the annotated line/column spacing while the driver code computes it from other parameters; since the number of lines and columns was modified in the test dataset, the computation gives a wrong result).

Test hdf5_16 detects the missing GCP spatial reference in DGM products.

comment:20 by alexmantaut, 13 years ago

Summary: [PATCH] COSMO-SKYMED Georreference supportH

Hi Antonio,

I've just runned your test and saw the errors. About test 11, I'm calculating the geotransform from the image's corners ("Top Left East-North and /Bottom Right East-North) is it okay? or must I use the column and row spacing? On the images I've tried geotransform seemed to be okay, but please confirm if test 11 is a real error.

Regarding test 16, I didn't provide spatial reference information for DGM products, because previously there wasn't a way to get a geotransform for those products... I think that now that GCPs are provided the spatial reference may become important, so for which products shall I provide a spatial reference? Shall I provide a spatial reference for every level?

Also, I modified the CPL_CVSID in the trunk patch, because otherwise I wasn't able to apply it to my code...

regards

by alexmantaut, 13 years ago

Small fix to the trunk patch

in reply to:  20 comment:21 by antonio, 13 years ago

Replying to alexmantaut:

Hi Antonio,

I've just run your tests and seen the errors. About test 11: I'm calculating the geotransform from the image's corners ("Top Left East-North" and "Bottom Right East-North"). Is that okay, or must I use the column and row spacing? On the images I've tried, the geotransform seemed to be okay, but please confirm whether test 11 is a real error.

The test does not work because the test file contains the original corner coordinates while the number of pixels/lines was changed with respect to the original image, so the test can't work as is. There are three options:

  • change the driver implementation to use (line/column spacing)
  • update the test dataset
  • change the test code

But please note that if you use a real CSK product to run test hdf5_11 it still fails, because the geotransform computation is inaccurate.
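The failure mode Antonio describes can be sketched without GDAL. A corner-based computation divides the annotated extent by the dataset's current pixel counts, so it only matches the annotated spacing when those counts are the original ones. The helper names and the sample numbers below are illustrative, not taken from the driver:

```python
# Illustrative scenario: an original 1000x1000 image at 2.5 m spacing,
# shrunk to 100x125 pixels in the test dataset with corners left untouched.
def geotransform_from_corners(tl_e, tl_n, br_e, br_n, ncols, nrows):
    """Corner-based estimate: pixel size = extent / current pixel count."""
    return (tl_e, (br_e - tl_e) / ncols, 0.0,
            tl_n, 0.0, (br_n - tl_n) / nrows)

def geotransform_from_spacing(tl_e, tl_n, col_spacing, line_spacing):
    """Spacing-based estimate: uses the annotated spacing directly."""
    return (tl_e, col_spacing, 0.0, tl_n, 0.0, -line_spacing)

tl_e, tl_n = 275592.5, 4998152.5           # top-left corner (east, north)
br_e, br_n = tl_e + 2500.0, tl_n - 2500.0  # bottom-right corner

# Spacing-based result is unaffected by the shrunken raster:
gt_spacing = geotransform_from_spacing(tl_e, tl_n, 2.5, 2.5)
# Corner-based result inflates the pixel size by the shrink factor:
gt_corners = geotransform_from_corners(tl_e, tl_n, br_e, br_n, 100, 125)

assert gt_spacing[1] == 2.5 and gt_spacing[5] == -2.5
assert gt_corners[1] == 25.0 and gt_corners[5] == -20.0
```

The six-element tuple follows GDAL's geotransform convention (origin, pixel width, rotation, origin, rotation, pixel height); on a real, unmodified product both routes agree, which is why the bug only surfaced on the edited test dataset.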

Regarding test 16, I didn't provide spatial reference information for DGM products, because previously there wasn't a way to get a geotransform for those products... I think that now that GCPs are provided the spatial reference may become important, so for which products shall I provide a spatial reference? Shall I provide a spatial reference for every level?

IMHO all products that have GCPs should have a valid pszGCPProjection.
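That rule can be stated as a simple check. The product table and field names below are made up for illustration and do not correspond to the driver's internal fields:

```python
# Hypothetical per-product summary (names illustrative, not driver fields):
products = {
    "SCS_B": {"gcps": 4, "gcp_projection": 'GEOGCS["WGS 84",...]'},
    "DGM_B": {"gcps": 4, "gcp_projection": ""},  # the case test hdf5_16 flags
    "GEC_B": {"gcps": 0, "gcp_projection": ""},  # georeferenced via geotransform
}

def missing_gcp_srs(products):
    """Return product levels that expose GCPs without a GCP projection,
    i.e. violations of the 'GCPs imply a valid pszGCPProjection' rule."""
    return [name for name, p in products.items()
            if p["gcps"] > 0 and not p["gcp_projection"]]

assert missing_gcp_srs(products) == ["DGM_B"]
```

GCPs are only usable by consumers such as gdalwarp when they come with a spatial reference describing the coordinate system the control points are expressed in, which is what the check above encodes.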

Also I modified the CPL_CVSID on the trunk patch, because I wasn't able to apply it to my code...

regards

by alexmantaut, 12 years ago

new version of the patch, spatial reference and geotransform issues solved.

comment:22 by alexmantaut, 12 years ago

Antonio,

I just finished fixing the errors you detected. I've decided to change the geotransform calculation to use the line/column spacing instead of deriving the matrix from the corners; this way the geotransform matrix is more accurate. I've run all your tests and they succeed... Feel free to add more in order to test the patch in depth.

Best regards Alex

PS: This Saturday (5/11) I will be travelling; I won't be available for modifications to the patch until the end of November.

comment:23 by etourigny, 12 years ago

Summary: H → add support for georreference and geotransform info from COSMO-SKYMED files to HDF5 driver

comment:24 by antonio, 11 years ago

Milestone: 1.10.0
Resolution: fixed
Status: assigned → closed

IMO this is closed by r25775 (ported from sandbox r25749). Even, please update the release notes accordingly.

comment:25 by Even Rouault, 11 years ago

This is already in NEWS:

HDF5 driver:
 * Add support for COSMO-SkyMed metadata

comment:26 by antonio, 11 years ago

Yes, I just meant to add the ticket number, as for the other changes.

comment:27 by Even Rouault, 11 years ago

Ah, ok. Done.

comment:28 by Even Rouault, 11 years ago

r25797 "HDF5: avoid issuing error when opening QLK subdataset of Cosmo-SkyMed datasets; add tests for Cosmo-SkyMed datasets (#4160)"

comment:29 by Markus Neteler, 11 years ago

Cc: Markus Neteler added

I have made an attempt with 1.10.0 and trunk but gdalwarp still fails:

Dataset: ftp://cosmodemo:cosmodemo@ftp.e-geos.it/StripMap%20PING%20PONG%20-%20Bad%20Salzungen/CSKS1_GEC_B_PP_18_CV_RD_SF_20080204172548_20080204172551.h5

gdalwarp  HDF5:"CSKS1_GEC_B_PP_18_CV_RD_SF_20080204172548_20080204172551.h5"://S01/QLK bla.tif
ERROR 1: Unable to compute a transformation between pixel/line
and georeferenced coordinates for HDF5:CSKS1_GEC_B_PP_18_CV_RD_SF_20080204172548_20080204172551.h5://S01/QLK.
There is no affine transformation and no GCPs.

gdal-config --version
1.10.0

(-> indeed, it was today's trunk)

Any pointer welcome.

comment:30 by Even Rouault, 11 years ago

Resolution: fixed
Status: closed → reopened

I've not read the spec, but from the sample attached to the ticket it seems that the /QLK subdatasets have no georeferencing attached. The /SBI ones do, however. But trying those, I see that the reported SRS is incompletely defined and will not be useful as such: it has no GEOGCS node attached. Some code is missing in HDF5ImageDataset::CaptureCSKGeolocation(int iProductType).
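The gdalwarp error above corresponds to a simple precondition: the source dataset must carry either a non-default affine geotransform or GCPs. A pure-Python restatement of that check (the default tuple is GDAL's "no georeferencing" convention; the helper name is made up):

```python
# GDAL's GetGeoTransform() returns this identity-like tuple when a
# dataset has no affine georeferencing:
DEFAULT_GT = (0.0, 1.0, 0.0, 0.0, 0.0, 1.0)

def has_warp_transform(geotransform, gcp_count):
    """True when a pixel/line -> georeferenced transform can be built,
    i.e. when gdalwarp would NOT raise 'There is no affine
    transformation and no GCPs.'"""
    return geotransform != DEFAULT_GT or gcp_count > 0

# A /QLK subdataset with neither reproduces the failure above:
assert not has_warp_transform(DEFAULT_GT, 0)
# A /SBI subdataset with a real geotransform can be warped:
assert has_warp_transform((275592.5, 2.5, 0.0, 4998152.5, 0.0, -2.5), 0)
# GCPs alone (e.g. DGM products after the patch) are also sufficient:
assert has_warp_transform(DEFAULT_GT, 4)
```

Even so, passing this check is not enough on its own: as noted, the transform is only usable when the accompanying SRS is completely defined.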

comment:31 by antonio, 11 years ago

Sorry Even, it is my fault (see r25801).

comment:32 by Even Rouault, 11 years ago

Antonio, ah indeed. Perhaps you can apply the fix in trunk and 1.10 then?

comment:33 by antonio, 11 years ago

Resolution: fixed
Status: reopened → closed

Done with r25997 and r25998.

comment:34 by Markus Neteler, 11 years ago

Milestone: 1.10.0 → 1.10.1

Excellent, thanks for the quick fix. I take the liberty of updating the Milestone since the fix will appear in 1.10.1.

Note: See TracTickets for help on using tickets.