Opened 13 years ago
Last modified 12 years ago
#896 new defect
Loading ISO19135 sample records slows Geonetwork-ANZMEST dramatically
Reported by: | awalsh | Owned by: | |
---|---|---|---|
Priority: | major | Milestone: | v2.8.0 |
Component: | General | Version: | |
Keywords: | performance, ISO19135 | Cc: | simon.pigot@… |
Description
I built GN trunk SVN9041 + ANZMEST schemas from trunk and tested loading of ISO19135 records.
I noted that GeoNetwork became very slow after I had added the sample ISO19135 records. CPU load became very high (50%) when displaying the 'home' page and when running a basic search (all results), and java.exe memory use jumped from 200 MB to 645 MB. The request /geonetwork/srv/eng/main.home took 17 minutes to finish and reload the home page!
I got a 'Java heap space' error from the basic search request /geonetwork/srv/eng/main.search.embedded as follows:
Search Error: HTTP ERROR 500
Problem accessing /geonetwork/srv/eng/main.search.embedded. Reason:
Java heap space
Caused by:
    java.lang.OutOfMemoryError: Java heap space
        at java.util.Arrays.copyOf(Arrays.java:2882)
        at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:100)
        at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:390)
        at java.lang.StringBuffer.append(StringBuffer.java:224)
        at java.io.StringWriter.write(StringWriter.java:84)
        at org.jdom.output.XMLOutputter.indent(XMLOutputter.java:1217)
        ...
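For context, the failing path in the trace is JDOM pretty-printing a whole record into an in-memory StringWriter. The sketch below is not GeoNetwork's actual code; it is a minimal illustration of that pattern, assuming JDOM 1.x (the org.jdom packages shown in the trace) and a placeholder file name, and it shows why the heap needed grows with the size of the record being serialized:

```java
import java.io.File;
import java.io.StringWriter;

import org.jdom.Document;
import org.jdom.input.SAXBuilder;
import org.jdom.output.Format;
import org.jdom.output.XMLOutputter;

public class RegisterSerializationSketch {
    public static void main(String[] args) throws Exception {
        // Parse a large ISO19135 register record (placeholder file name).
        Document record = new SAXBuilder().build(new File("iso19135-register.xml"));

        // Pretty-printing the whole document into a StringWriter keeps the entire
        // serialized record in memory. StringBuffer.append() repeatedly reallocates
        // and copies its backing array as the buffer grows, which is the
        // Arrays.copyOf()/expandCapacity() path seen in the stack trace above.
        StringWriter out = new StringWriter();
        new XMLOutputter(Format.getPrettyFormat()).output(record, out);

        System.out.println("Serialized size (chars): " + out.getBuffer().length());
    }
}
```

With a 512m heap, a sufficiently large record fails inside the same StringBuffer growth path shown in the trace.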
FYI, maximum heap memory was set to 512m and the Java/Jetty start command was: java -Xms48m -Xmx512m -Xss2M -XX:MaxPermSize=128m -Dmime-mappings=..\web\geonetwork\WEB-INF\mime-types.properties -DSTOP.PORT=8079 -Djava.awt.headless=true -DSTOP.KEY=geonetwork -jar start.jar
Further info:
Hardware: Intel Core 2 Duo CPU 3.33 GHz, 3.5 GB RAM
Operating system: Windows XP
Java: 1.6_07
Servlet container: Jetty
Change History (5)
comment:1 by , 13 years ago
Andrew - if you run short of memory, everything runs slowly because the Java virtual machine spends a lot of time trying to reclaim memory, so the sensational figure of 17 minutes to load the home page isn't really meaningful on its own.
The memory requirements here are large because these records can be quite big and the GeoNetwork editor tries to create an html form that includes the whole record.
Later revs (you should wait until my proposal on subtemplate extraction is hopefully approved) step around this issue by extracting the large, repeated parts of a register record into subtemplates, so the editor no longer has to build a single html form for the whole record.
Records are only going to get larger, so the strategy of editing a large record in a single html form needs help, which is why other strategies are being (and need to be) developed.
Also, on a 3.5 GB RAM machine you might be a little more generous with what you allocate to GeoNetwork - how about at least 1.5 GB (-Xmx1536m)? :)
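For example, the start command from the description adjusted with a larger heap (1536m here is a suggested starting point along the lines of this comment, not a tested value) would look something like:
java -Xms48m -Xmx1536m -Xss2M -XX:MaxPermSize=128m -Dmime-mappings=..\web\geonetwork\WEB-INF\mime-types.properties -DSTOP.PORT=8079 -Djava.awt.headless=true -DSTOP.KEY=geonetwork -jar start.jar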
comment:2 by , 13 years ago
I'll leave this open until you get a chance to run later revisions (9104) of GeoNetwork with the recommended approach to large register records.
comment:3 by , 13 years ago
Or I should say until the extract-subtemplates proposal and the documentation describing how to edit large register records are committed, which I hope will be svn rev 9104 (the next rev :-))
comment:4 by , 13 years ago
Thanks Simon, makes sense. Will try a more recent rev. and increase my heap space.
comment:5 by , 12 years ago
Milestone: | v2.7.0 → v2.8.0 |
---|---|