Getting Started with Varnish Cache



If you are creating a web application and researching how to scale it, or trying to tune the performance of an existing application, then caching content is a great place to start. For caching HTTP content (and, with a TLS terminator in front of it, HTTPS), Varnish Cache is a tried-and-true solution. This article covers its setup and shares some statistics on how much it can improve performance.


Varnish Cache is used by some of the biggest, highest-traffic sites on the Internet to optimize performance by caching key site components. It was built from the ground up to be a server-side caching solution and works well with even the most dynamically generated sites.

Why Should I Cache?

To put it simply, I/O (read/write) operations to disk drives are some of the slowest operations that modern servers can perform. While SSDs, or solid state drives, make things much more performant than traditional hard drives (also known as spinning disks or spindles), memory (RAM) is still the fastest way to retrieve things.

There is an entire industry of CDN (content delivery network) providers who specialize in caching content from your application as close to the end user as possible, and they charge for that privilege. While using a CDN makes sense in many situations, adding some caching to your own deployment is almost always the right first step.

Performance Before Varnish

For this purpose, I set up an Ubuntu 16.04 droplet on DigitalOcean and followed a guide to install nginx + PHP. I then created an index.php file that takes roughly one second to generate, simulating the database connections and backend processing of a content-heavy page on a WordPress or Drupal site; a minimal sketch of such a page is shown below.
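
A minimal sketch of such a test page, assuming the only goal is to simulate roughly one second of backend work (the real page and its database calls are not part of the article), might look like this:

<?php
// index.php - hypothetical stand-in for a content-heavy page.
// sleep(1) simulates database queries and backend processing.
sleep(1);
header('Content-Type: text/html; charset=utf-8');
echo '<html><body>';
echo '<h1>Simulated content-heavy page</h1>';
echo '<p>Generated at ' . date('H:i:s') . '</p>';
echo '</body></html>';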

Before Performance Graph:
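
The article does not name the load-testing tool behind these graphs; as an assumption, a simple way to gather comparable numbers is ApacheBench (ab), which reports the mean time per request:

# Hypothetical measurement; replace your-droplet-ip with the server's address.
# 100 requests, 10 at a time; compare the "Time per request" line before and after Varnish.
ab -n 100 -c 10 http://your-droplet-ip/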

Sample Varnish Configuration with nginx

Starting with the same droplet, Varnish Cache was installed using “apt install varnish” and then configured in five steps (a consolidated sketch of the commands and configuration changes follows the list):

1) Validate that the default Varnish configuration (/etc/varnish/default.vcl) points to a backend on port 8080.

2) Update the Varnish Cache daemon to listen on port 80 instead of its default, 6081.

3) Update nginx to listen on port 8080 by changing two lines in the default server config.

4) Restart both services (the same commands work on systemd-based systems such as RHEL 7+ and Ubuntu 16.04+).

5) Varnish is now configured and running in front of nginx.
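
A consolidated sketch of those five steps, assuming the stock Ubuntu 16.04 file locations and a systemd-based service manager (adjust paths and ports for your environment):

# 1) Confirm the default backend in /etc/varnish/default.vcl points at port 8080:
#      backend default { .host = "127.0.0.1"; .port = "8080"; }
grep -A 3 'backend default' /etc/varnish/default.vcl

# 2) Change the Varnish daemon's listen address from :6081 to :80 by editing
#    the -a option on the ExecStart line of the varnish systemd unit.
sudo systemctl edit --full varnish
#    ExecStart=/usr/sbin/varnishd ... -a :80 ... -f /etc/varnish/default.vcl ...

# 3) Change nginx to listen on 8080 in /etc/nginx/sites-available/default:
#      listen 8080 default_server;
#      listen [::]:8080 default_server;

# 4) Reload systemd and restart both services.
sudo systemctl daemon-reload
sudo systemctl restart nginx varnish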

Performance After

While this isn’t the most complex example, it does show that simply adding Varnish Cache to the mix can drastically improve performance. In this case, the average response time dropped by 95%, from roughly one second per request to around 50 milliseconds.

After Performance Graph:
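
As a quick sanity check (not part of the original write-up), the response headers show whether Varnish served a request from cache: on a hit, the Age header is greater than zero and the X-Varnish header carries two transaction IDs.

# Request the page twice so the second response can come from cache, then inspect the headers.
curl -sI http://your-droplet-ip/ | grep -Ei '^(age|x-varnish|via):'
# Age: 37             (seconds the object has been cached; 0 usually means a miss)
# X-Varnish: 32773 3  (two IDs on a hit: this request and the one that stored the object)
# Via: 1.1 varnish-v4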


Performance optimization of the web and mobile applications your business is built on is critical to retaining a happy customer base, and a solid way to keep employees engaged. Building your performance strategy on proven, market-tested open source technology is a safe, strategic move, and Varnish Cache is the leading technology to support that decision.


Vince Power is an Enterprise Architect at Medavie Blue Cross. His focus is on cloud adoption and technology planning in key areas like core computing (IaaS), identity and access management, application platforms (PaaS), and continuous delivery.

