Proposal number | ? |
Proposal title | GZIP content compression filter to reduce network traffic |
Date | 2008/07/21 |
Contact(s) | Simon Pigot |
Last edited | Timestamp |
Status | draft, being discussed, complete |
Assigned to release | 2.4.0 |
Resources | Test implementation is complete |
Overview
Compression reduces the size of HTTP responses, and compressed content is now understood by most browsers (it's been around since HTTP/1.1), including Firefox 1+, IE 5+, etc. This can noticeably reduce the time it takes for pages with lots of content to refresh and draw. Most developers don't notice the difference because they usually test against localhost. Responses sent by GeoNetwork will only grow larger as the metadata standards and the content they contain grow richer and the user interface becomes more complex (read: more JavaScript). As an example, a large iso19139 metadata record with a few sizeable select/pulldown lists took half the time it used to take to render once the compression filter was added.
Proposal Type
- Type: Servlet configuration, configuration change
- App: GeoNetwork and Intermap
- Module: servlet configuration (web.xml files)
Links
- Documents: "Two servlet filters every web app should have" (jspbook.com) and an article discussing HTTP compression and browser support
- This has been available for some time in the BlueNET MEST - no reported issues.
Voting History
- Not voted on yet.
Motivations
Compression of HTTP responses from GeoNetwork will speed up page refreshes and reduce the network footprint required to run GeoNetwork - see overview above.
Proposal
A GZIP compression filter has been written for general use by the people at jspbook.com - see "Two servlet filters every web app should have". This filter has been adapted for use in GeoNetwork, and some issues that caused trouble in Internet Explorer have been fixed. The filter is only applied to requests that can accept gzip compression and that are not asking for an image (trapped using suffixes) or for a resource such as a thumbnail or a download file, as all of these are almost always compressed already.
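The actual implementation lives in GZIPFilter.jar; the decision logic it applies is roughly the following (a minimal sketch only - the class and method names here are assumptions for illustration and are not the com.jspbook.GZIPFilter source, and the response wrapping through a GZIPOutputStream is omitted):

    // Sketch of the selective-compression decision described above.
    import javax.servlet.http.HttpServletRequest;

    public class GzipDecision {

        // Mirrors the do-not-gzip-these-urls init-param in web.xml
        private static final String[] SKIP_SUFFIXES =
            { ".gif", ".jpg", ".png", ".tif", ".bmp", ".zip", ".gzip" };

        /** Returns true when the response to this request should be gzip-compressed. */
        public static boolean shouldCompress(HttpServletRequest request) {
            // 1. Only compress if the client advertises gzip support
            String acceptEncoding = request.getHeader("Accept-Encoding");
            if (acceptEncoding == null || !acceptEncoding.toLowerCase().contains("gzip")) {
                return false;
            }
            // 2. Skip content that is almost always compressed already
            String uri = request.getRequestURI().toLowerCase();
            for (String suffix : SKIP_SUFFIXES) {
                if (uri.endsWith(suffix)) {
                    return false;
                }
            }
            return true;
        }
    }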
One other function that may prove useful has been added to the filter. This function sets the expiry date on certain types of content (JavaScript, CSS, locale images) far into the future. The idea is that this content will not be downloaded each time a user returns to a previously loaded page; instead it will be served from the browser or proxy cache. The downside of this feature (apart from the question of how much it really saves) is that the URLs of the static content need to change (eg. by adding ?version=whatever to JavaScript tags) when a new version of GeoNetwork is released, to avoid stale static content being retrieved from the cache. This part is not implemented in GeoNetwork, and whether it is worth enabling will depend on further testing of the feature. A sketch of the expiry behaviour follows.
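For reference, the far-future expiry behaviour can be sketched as follows (a minimal illustration only, assuming the same URL prefixes used in the urls-to-expire-forward init-param below; this is not the actual GZIPFilter.jar code):

    // Sketch of the far-future expiry behaviour for static content described above.
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class ExpirySketch {

        // Mirrors the urls-to-expire-forward init-param in web.xml
        private static final String[] EXPIRE_FORWARD_PREFIXES =
            { "/geonetwork/scripts", "/geonetwork/images", "/geonetwork/loc" };

        private static final long ONE_YEAR_MILLIS = 365L * 24 * 60 * 60 * 1000;

        /** Sets a far-future Expires header so browsers and proxies cache static content. */
        public static void setFarFutureExpiry(HttpServletRequest request,
                                              HttpServletResponse response) {
            String uri = request.getRequestURI();
            for (String prefix : EXPIRE_FORWARD_PREFIXES) {
                if (uri.startsWith(prefix)) {
                    response.setDateHeader("Expires", System.currentTimeMillis() + ONE_YEAR_MILLIS);
                    return;
                }
            }
        }
    }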
The steps needed to insert the compression filter into the servlet path involve modifications to web/geonetwork/WEB-INF/web.xml and web/intermap/WEB-INF/web.xml as follows:
    <!-- modified version of jspbook code to support selective compression,
         and fix bugs that broke IE - see GZIPFilter.jar -->
    <filter>
      <filter-name>GZIPCompressor</filter-name>
      <filter-class>com.jspbook.GZIPFilter</filter-class>
      <init-param>
        <param-name>urls-to-expire-forward</param-name>
        <param-value>/geonetwork/scripts,/geonetwork/images,/geonetwork/loc</param-value>
      </init-param>
      <init-param>
        <param-name>do-not-gzip-these-urls</param-name>
        <param-value>.gif,.jpg,.png,.tif,.bmp,.zip,.gzip</param-value>
      </init-param>
    </filter>

    <filter-mapping>
      <filter-name>GZIPCompressor</filter-name>
      <url-pattern>/*</url-pattern>
    </filter-mapping>
The final step is the inclusion of a new jar, GZIPFilter.jar, in web/geonetwork/WEB-INF/lib and web/intermap/WEB-INF/lib.
Backwards Compatibility Issues
Some allusion is made in various places on the net to early browser versions not properly supporting compressed content - see the links above for a discussion of HTTP compression and browser support. The short answer is that the browsers we want to support (IE 7+, Firefox 2+ and others) all handle compressed content without issues, BUT there have been reports of gzip compression causing problems on Windows Vista. If Vista turns out to be an issue, Vista clients could be trapped by examining HTTP headers and gzip compression switched off for such requests (see the sketch below), or alternatively sites can disable the filter altogether by commenting it out in the web.xml file. No reports of this problem have been made by BlueNet users yet, and the filter has been in use for approximately six months.
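Should that ever prove necessary, trapping Vista clients could look roughly like this (a hedged sketch only, not part of the current filter; it relies on User-Agent sniffing, and keying on the "Windows NT 6.0" token is an assumption about how such a check would be written):

    // Sketch of how Vista clients could be excluded from gzip compression if needed.
    import javax.servlet.http.HttpServletRequest;

    public class VistaCheck {

        /** Returns true when the request appears to come from a Windows Vista client. */
        public static boolean isVistaClient(HttpServletRequest request) {
            String userAgent = request.getHeader("User-Agent");
            // Windows Vista reports itself as "Windows NT 6.0" in the User-Agent header
            return userAgent != null && userAgent.contains("Windows NT 6.0");
        }
    }

If such a check returned true, the filter would simply pass the request through uncompressed, exactly as it already does for clients that do not send an Accept-Encoding: gzip header.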
Risks
Gzip compression for Windows Vista clients may need more testing.
Participants
- As above