Varnish + Zope – Multiple zope instances behind a single varnish cache

I run multiple Zope instances on one server. Each Zope instance listens on a different port (localhost:100xx). Historically I’ve just used Apache as a front end which forwards requests to the Zope instance.

Unfortunately there are periods of the year when one site gets a deluge of requests (for example, when hosting a school site: if it snows overnight, all the parents will check the site at around 8am the next morning).

Zope is not particularly quick on its own – Apache’s “ab” benchmark reports that a dual-core server with plenty of RAM can manage about 7–14 requests per second – which isn’t many when you consider that each page on a Plone site pulls in a large number of dependencies (CSS, JavaScript, images and so on).

Varnish is a reverse HTTP proxy – meaning it sits in front of the real web server, caching content.

So, as I’m using Debian Lenny….

  1. apt-get install -t lenny-backports varnish
  2. Edit /etc/varnish/default.vcl
  3. Edit Apache virtual hosts to route requests through varnish (rather than directly to Zope)
  4. I didn’t need to change /etc/default/varnish.
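
For reference, the Debian defaults in /etc/default/varnish were already what I wanted – varnish listening on port 6081 and loading /etc/varnish/default.vcl. The relevant lines look roughly like this (the storage size and paths are illustrative defaults, so check your own file):

```
START=yes
DAEMON_OPTS="-a :6081 \
             -T localhost:6082 \
             -f /etc/varnish/default.vcl \
             -s file,/var/lib/varnish/varnish_storage.bin,1G"
```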

In my case there are a number of Zope instances on the same server, but I only wanted to have one instance of varnish running. This is possible – but it requires me to look at the URL requested to determine which Zope instance to route through to.

So, for example, SiteA runs on a Zope instance at localhost:10021/sites/sitea. My original Apache configuration contained the standard Zope VirtualHostMonster rewrite, something like:

<IfModule mod_rewrite.c>
   RewriteEngine on
   RewriteRule ^/(.*) http://localhost:10021/VirtualHostBase/http/%{HTTP_HOST}:80/sites/sitea/VirtualHostRoot/$1 [L,P]
</IfModule>

To use varnish, I first need to tell Varnish how to recognise requests for sitea (and the other sites), so it can forward a cache miss to the right place, and then reconfigure Apache so that it sends requests to varnish rather than directly to Zope.

So, firstly, in Varnish’s configuration (/etc/varnish/default.vcl), we need to define the different backend servers we want varnish to proxy / cache. In my case they’re all on the same server –

backend zope1 {
    .host = "127.0.0.1";
    .port = "10021";
}

backend zope2 {
    .host = "127.0.0.1";
    .port = "10022";
}
Then, in the 'sub vcl_recv' section, use logic like:

if (req.url ~ "/sites/sitea/VirtualHostRoot") {
    set req.backend = zope1;
}
if (req.url ~ "/siteb/VirtualHostRoot") {
    set req.backend = zope2;
}
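
Requests that match neither rule need somewhere to go: if you declare a backend named "default", Varnish uses it when nothing in vcl_recv sets req.backend explicitly. A sketch (the port here is an assumption – point it at whichever instance should catch stray requests):

```
# Fallback backend for requests no vcl_recv rule matched (port is an example)
backend default {
    .host = "127.0.0.1";
    .port = "10021";
}
```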

With the above in place, I can now just tell Apache to rewrite SiteA to varnish instead:

RewriteRule ^/(.*) http://localhost:6081/VirtualHostBase/http/%{HTTP_HOST}:80/sites/sitea/VirtualHostRoot/$1 [L,P]

…and now we’ll find that our site is much quicker 🙂 (This assumes your varnish listens on localhost:6081, the Debian default.)
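
For completeness, that rewrite lives inside an ordinary Apache virtual host; a minimal sketch (the ServerName is a placeholder) might be:

```
<VirtualHost *:80>
    ServerName sitea.example.org

    RewriteEngine On
    # Hand everything to varnish on localhost:6081; varnish forwards
    # cache misses on to the Zope instance defined in default.vcl
    RewriteRule ^/(.*) http://localhost:6081/VirtualHostBase/http/%{HTTP_HOST}:80/sites/sitea/VirtualHostRoot/$1 [L,P]
</VirtualHost>
```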

There are a few additional snippets I found useful – in the vcl_fetch { … } block, I’ve told Varnish to always cache objects for at least 30 seconds, and to overwrite the default Server header given out by Apache, namely:

sub vcl_fetch {

    # ..... <snip> <snip>

    # force a minimum ttl for objects
    if (obj.ttl < 30s) {
        set obj.ttl = 30s;
    }

    # ... <snip> <snip>

    # hide the real server software
    unset obj.http.Server;
    set obj.http.Server = "Apache/2 Varnish";

    return (deliver);
}
I'm happy anyway. :)
Use 'varnishlog', 'varnishtop' and 'varnishhist' to monitor varnish.
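
For example (in Varnish 2.x, TxURL is the URL of a request varnish makes to a backend, i.e. a cache miss):

```
varnishlog                 # tail the full shared-memory log
varnishtop -i TxURL        # most frequent backend (cache-miss) URLs
varnishhist                # histogram of request service times
```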


  1. I love Varnish to bits <3

    At the moment I am using HAproxy as a front-end and L7-capable load balancer for all the incoming HTTP traffic: it also terminates all the SSL connections for me.

    For sites which have caching enabled this then feeds into Varnish, which is there primarily to cache static content and stop those GET requests hitting the application servers which power the dynamic apps. Being able to script up different policies for various types of files is awesome.

    Finally you have Apache, memcached and all the "usual" gubbins in the back end.

    How much additional performance are you getting by using Varnish rather than just dishing the requests straight to Zope?

  2. Alex – native Zope was around 5-10 requests a second ( if I remember correctly ) – with varnish – about 200… which is enough 🙂

    The customers certainly notice the difference, and I wish I’d done it before.

  3. Hi All,

    I want to run a varnish cache on a single server, with multiple domains on another server.
    Please help me ASAP.


  4. In my case – yes – but obviously you don’t have to – varnish can act as the front end – which would give far better performance. In my case I needed to integrate it into an existing environment where Apache was in use on an IP address which was being used for a number of other websites.

  5. Rather, it is exactly what I need.
    I have many domains, in different Zope instances, currently running behind Squid. Annoyingly, Squid only allows one virtual host with SSL, and it scales less well than varnish.
    I have now added varnish behind Apache, which I intend to configure with SSL virtual hosts.

    But I am having many problems. Can you help me?

  6. run varnish; run apache;
    dzetaPool2 runs zope;

    sub vcl_recv {
        if (req.http.host ~ "^varnish\.mydomain\.br" && req.url ~ "/brasilconectado/brasilconectado/VirtualHostRoot") {
            set req.backend = dzetaPool2;
        }
    }


    SSLEngine on
    SSLCertificateFile /etc/apache2/ssl/
    SSLCertificateKeyFile /etc/apache2/ssl/

    RewriteEngine On
    RewriteRule ^/(.*) http://varnish.mydomain.br:6081/VirtualHostBase/https/%{SERVER_NAME}:443/brasilconectado/brasilconectado/VirtualHostRoot/$1 [L,P]


  7. Ricardo – what problem(s) are you having? And is a Host header sent when you make an SSL connection? Have you tried running ‘varnishlog’?

  8. Hi,

    I wonder how you configure Plone to send the purge command to invalidate the contents at the corresponding backend?

