#89 closed task (fixed)
Improve caching of website
Reported by: | tmitchell | Owned by: | |
---|---|---|---|
Priority: | critical | Milestone: | |
Component: | SysAdmin | Keywords: | drupal cache performance |
Cc: | | | |
Description
A few different people, in a few different contexts, have been asking about caching on the foundation website. For the record, here is what I've done:
- enabled Drupal's internal caching, with a 5-minute minimum lifetime
- enabled MySQL query caching by configuring it in /etc/my.cnf, with 64MB of RAM (see the sketch below)
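A minimal sketch of that my.cnf change, assuming the standard MySQL query cache variables (the exact settings used on the server were not recorded in the ticket):

```
# /etc/my.cnf (sketch) — enable MySQL's query cache with 64MB of RAM
[mysqld]
query_cache_type = 1      # cache eligible SELECT results
query_cache_size = 64M    # memory reserved for cached results
```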
Others have asked about Squid caching, but I'm ignorant of this, so I'm passing it to SAC for others to comment. Setting this to minor now that these two methods are both working.
Change History (9)
comment:1 by , 18 years ago
comment:2 by , 18 years ago
It has also been noted that Drupal may not be handling 404 requests nicely. Since there are many dead links from the old site structure, it is recommended we modify .htaccess to handle them better:
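The recommended snippet itself was not preserved in the ticket. As one illustrative approach (an assumption, not necessarily the original recommendation), a custom error document can route dead links through Drupal so it serves its own 404 page:

```
# Sketch only — hand 404s to Drupal's front controller
ErrorDocument 404 /index.php
```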
And as clouseau on IRC mentions, this was added to .htaccess in Drupal 5 but will work with 4.7:

```
# Requires mod_expires to be enabled.
<IfModule mod_expires.c>
  # Enable expirations.
  ExpiresActive On
  # Cache all files for 2 weeks after access (A).
  ExpiresDefault A1209600
  # Do not cache dynamically generated pages.
  ExpiresByType text/html A1
</IfModule>
```

clouseau adds that this helps graphically-heavy sites more.
- Upgrading to Drupal 5 will help in some areas too
comment:3 by , 18 years ago
Both of those changes to .htaccess make a lot of sense. I'd even extend the caching of text/html content, but I'd be surprised if content generated dynamically by PHP is covered by mod_expires anyway.
I could be wrong, though; all of my experience thus far has been with Apache 1.x, and some of the module call orders seem to have changed considerably in 2.x.
comment:4 by , 18 years ago
Owner: | changed |
---|---|
Priority: | minor → critical |
Type: | enhancement → task |
Please review the above, but at the least we need to be calling the cron.php script regularly (e.g. using wget). It will help optimise Drupal as well as rebuild search indexes, etc. Let's try every 2 hours to start with; a sketch of such an entry follows.
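A sketch of such an entry, assuming a system-wide /etc/cron.d file (the file name, user, and exact schedule are illustrative):

```
# /etc/cron.d/drupal (sketch) — hit cron.php every 2 hours, quietly
0 */2 * * * root wget -q -O /dev/null http://www.osgeo.org/cron.php
```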
comment:5 by , 18 years ago
Owner: | changed |
---|---|
I have added a job in /etc/cron.d/backup.cron to invoke /root/scripts/drupal_cron.sh, which does a quiet wget against http://www.osgeo.org/cron.php and then cleans up.
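The script itself was not attached to the ticket; a plausible minimal version, based only on the description above (a quiet wget followed by cleanup of the fetched file):

```
#!/bin/sh
# /root/scripts/drupal_cron.sh (sketch) — trigger Drupal's cron, then tidy up
cd /tmp || exit 1
wget -q http://www.osgeo.org/cron.php
rm -f cron.php*
```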
I doubt I'll be doing the rest of the stuff myself, so I'm reassigning this back to SAC.
comment:7 by , 18 years ago
Tyler,
In my opinion, the load on the peer1 system is tiny now, and there is no real need to push further on caching. As we learned, it wasn't Drupal that was our performance bottleneck.
I'd encourage closing this ticket.
comment:8 by , 18 years ago
Resolution: | → fixed |
---|---|
Status: | new → closed |
Closing as there is no longer any real need to improve performance. Load is very modest.
comment:9 by , 17 years ago
I found that cron.php had been failing. I fixed it by increasing the memory limit to 32MB in php.ini. I was sure I had increased it before, but it was back to the default of 16MB. I also "repaired" the search index tables; sketches of both fixes follow below.
See wiki for more info: http://wiki.osgeo.org/index.php/OSGeo_Portal_Site#Troubleshooting
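Sketches of the two fixes described above. The php.ini change is straightforward; the table names in the repair statement are Drupal's standard search tables and are an assumption about exactly what was run:

```
; php.ini (sketch) — raise the limit back up from the 16MB default
memory_limit = 32M
```

```
-- Assumed repair of Drupal's MyISAM search tables
REPAIR TABLE search_index, search_total, search_dataset;
```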
Other comments received include:
Can someone please set up a cron job that calls http://oursite.com/cron.php and also runs the above script against the db?