I've been using Serendipity for a couple of years and have really come to love it. But sometimes I wish it were faster. Servers differ, of course, but an average generation time for, say, one category listing with 15 entries is about 5 seconds or more. So, having learned Varnish Cache, I couldn't wait any longer to put it between Serendipity and the end user. Of course, Varnish limits a few blogging features, but the benefits are worth it. Sounds interesting? Then this article is for you.
First of all, we need to make our web server listen on some port other than 80. Additionally, the web server should listen on localhost instead of an external address, so that the uncached content is not visible to the world. This is good for SEO, as there will be no duplicate content on the net, and it is also a security measure, see below. Let's take a look at a sample Apache vhost config:
INI:
<VirtualHost 127.0.0.1:8080>
ServerAdmin webmaster@example.com
ServerName example.com
AccessFileName .htaccess
RewriteEngine On
DocumentRoot /var/www/example.com/htdocs/
ErrorLog /var/log/apache2/example.com.error.log
CustomLog /var/log/apache2/example.com.access.log combined
ServerSignature On
</VirtualHost>
For this to work you'll need to set the Listen directive correspondingly to 8080. Note that if you have multiple virtual hosts on the same web server, there is probably no way to keep some of them on port 80, since that port will already be taken by Varnish; for now I'll just assume only one domain running on the server.
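For reference, a minimal sketch of that change, assuming a Debian-style layout where the directive lives in /etc/apache2/ports.conf (your distribution may keep it in the main httpd.conf instead):
INI:
# Replace (or adjust) the default "Listen 80" line. Binding to localhost
# keeps the uncached backend private, as discussed above.
Listen 127.0.0.1:8080
Now that the Apache config is in place, it's time for Varnish: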
INI:
backend default {
.host = "127.0.0.1";
.port = "8080";
}
sub vcl_recv {
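# Admin, comment, captcha and delete URLs always go straight to the backend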
if (req.url ~ "^(/serendipity_admin.php|/comment.php|/plugin/captcha|/delete/).*") {
return (pass);
}
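# Strip client cookies from all remaining requests so the responses become cacheable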
if (!(req.url ~ "^(/serendipity_admin.php|/delete/).*")) {
unset req.http.cookie;
}
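# Allow delivering an object up to 30 seconds past its TTL if no fresh copy is ready yet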
set req.grace = 30s;
}
sub vcl_fetch {
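# For cacheable content, strip the backend headers that would prevent caching and keep objects for roughly a day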
if (!(req.url ~ "^(/serendipity_admin.php|/comment.php|/plugin/captcha|/delete/).*")) {
unset beresp.http.set-cookie;
unset beresp.http.expires;
unset beresp.http.cache-control;
unset beresp.http.pragma;
unset beresp.http.last-modified;
set beresp.ttl = 86000s;
}
if (req.url ~ "^(/serendipity_admin.php|/comment.php|/plugin/captcha|/delete/).*") {
set beresp.ttl = 0s;
}
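# Static assets: cache for 32 days and send long-lived caching headers to the client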
if (req.url ~ ".*\.(css|js|png|jpg|gif)$") {
set beresp.ttl = 2764800s;
set beresp.http.cache-control = "max-age=2764800, public";
set beresp.http.last-modified = "Wed, 15 Nov 1995 04:58:08 GMT";
}
set beresp.grace = 30s;
}
As you can see, I define a backend with the host and port Apache is listening on. In vcl_recv the cookies sent by the client are deleted, which lets the content be cached; for the URIs that are not supposed to be cacheable, the request is immediately passed to the backend.
In vcl_fetch the important step is to delete all the response HTTP headers that could affect or disrupt caching, and to set the object TTL manually: roughly one day for pages, and even longer for CSS, JS and image files.
In both subs we also tell Varnish to deliver expired objects under certain circumstances, for example while another thread is still busy fetching a fresh copy of the same object; that is what the grace settings are for.
In case there are more hosts on the web server that you would like not to cache, just add a host match at the top of each of our subs:
INI:
sub vcl_recv {
if (req.http.host ~ "wont-cache-this.com") {
return(pass);
}
.............................
}
sub vcl_fetch {
if (req.http.host ~ "wont-cache-this.de") {
return(hit_for_pass);
}
..............................
}
These hosts must of course also be served by the same backend host and port your cacheable host uses. Either way, Varnish will pass everything on them through without caching.
Wow, what now? Comments are not shown after somebody writes them (fine, auto moderation is active), but nobody can even post a comment at all. All right! The blog probably still has the spam protection plugin activated, which produces captchas and has the CRLF antispam option enabled. The CRLF protection always checks for a session token, which will be wrong or missing, because an article page is normally served from the cache and cookies are passed to neither the backend origin nor the end client. I personally decided to turn the CRLF check off, since comments will not appear on a cached article page in any case, and to rely more on the Akismet antispam and IP blocking; in the meantime I can moderate spam comments. There is pretty much no way to get the CRLF check working together with Varnish.
This is just a simple solution for speeding up a Serendipity blog, and it comes with some benefits and some losses. Anyway, the page generation time for my blog now amounts to about 200 ms instead of 3-5 s. I find this amazing.
For better cache handling there should be a Serendipity plugin that resets the cache in the usual cases, like:
- an article was edited
- a category was edited
- a comment was added
- etc.
If I had more time, I would write such a plugin. Maybe I'll do that at some point, if one doesn't already exist.
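One building block such a plugin could use: let Varnish accept an HTTP BAN request from localhost, so that a plugin (or a plain curl call) can invalidate pages without shelling out to varnishadm. A rough sketch in the same Varnish-3-style VCL as above; the acl name and the choice of the BAN method are my own:
INI:
acl purge {
"localhost";
"127.0.0.1";
}
sub vcl_recv {
# put this block at the very top of the existing vcl_recv
if (req.request == "BAN") {
if (!client.ip ~ purge) {
error 405 "Not allowed.";
}
# ban every cached object of this host whose URL matches the requested URL
ban("req.http.host == " + req.http.host + " && req.url ~ " + req.url);
error 200 "Ban added.";
}
.............................
}
A call like curl -X BAN http://example.com/ from the web server itself would then flush every page of that host, since every URL matches the regex "/".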

But for now, after such a content change you will probably need to trigger the cache update manually. This consists of two steps: first, invalidate the whole site cache
BASH:
root@host:~$ varnishadm
varnish> ban req.http.host ~ "example.com"
200
and then warm it up again by crawling the site:
BASH:
user@host:~$ cd /tmp && /usr/bin/wget -m -w 12 -A '*.html,*.css,*.js' --delete-after http://example.com
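If you do this regularly, the two steps can be wrapped into one small script and run after each content change. A minimal sketch using the same commands and domain as above (it needs access to the Varnish admin interface, so run it as root or pass the -T/-S options to varnishadm):
BASH:
#!/bin/sh
# flush everything for the blog's host from the Varnish cache ...
varnishadm 'ban req.http.host ~ "example.com"'
# ... and crawl the site once to warm the cache up again
cd /tmp && wget -q -m -w 12 -A '*.html,*.css,*.js' --delete-after http://example.com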
That's all for now, best regards ...