If you are building a web application and researching how to scale it, or trying to tune the performance of an existing application, caching content is a great place to start. For caching HTTP and HTTPS content, Varnish Cache is a tried-and-true solution. This article covers its setup and includes statistics on how much it can improve performance.
Varnish Cache is used by some of the biggest, highest-traffic sites on the Internet to optimize performance by caching key site components. It was built from the ground up to be a server-side caching solution and works well with even the most dynamically generated sites.
Why Should I Cache?
To put it simply, I/O (read/write) operations against disk drives are among the slowest operations a modern server performs. While SSDs (solid-state drives) are much faster than traditional hard drives (also known as spinning disks or spindles), memory (RAM) is still the fastest place to retrieve data from.
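The gap is easy to demonstrate. The following sketch (my own illustration, not part of the article's setup) compares re-reading a file from disk with serving it from an in-memory dict, which is the same trade a cache like Varnish makes at the HTTP layer:

```python
import os
import tempfile
import time

# Write a small "page" to disk, then compare repeated disk reads
# against reads from an in-memory cache (a plain dict).
path = os.path.join(tempfile.mkdtemp(), "page.html")
with open(path, "w") as f:
    f.write("<html>" + "x" * 100_000 + "</html>")

def read_disk():
    with open(path) as f:
        return f.read()

cache = {}
def read_cached():
    # Populate the cache on the first miss, then serve from memory.
    if path not in cache:
        cache[path] = read_disk()
    return cache[path]

start = time.perf_counter()
for _ in range(1000):
    read_disk()
disk_time = time.perf_counter() - start

start = time.perf_counter()
for _ in range(1000):
    read_cached()
cache_time = time.perf_counter() - start

# Note: the OS page cache already speeds up the repeated disk reads;
# the dict still wins because it avoids open/read syscalls entirely.
print(f"disk: {disk_time:.4f}s  memory: {cache_time:.4f}s")
```

On any reasonable machine the in-memory path is orders of magnitude faster, even though the operating system is itself caching the file.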
There is an entire industry of CDN (content delivery network) providers that specialize in caching content from your application as close to the end user as possible, and they charge for that privilege. While using a CDN makes sense in many situations, adding a caching layer to your own deployment is an excellent first step.
Performance Before Varnish
For this purpose, I set up an Ubuntu 16.04 Droplet on DigitalOcean and followed a guide to install nginx and PHP. I then created an index.php file that takes one second to generate, simulating a content-heavy page that performs database queries and backend processing, as a WordPress or Drupal site would.
# vi /var/www/html/index.php

<?php sleep(1); // simulate one second of backend processing ?>
Nothing to see here...
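Before adding a cache, it helps to confirm the baseline. The Python sketch below (my own stand-in for timing a request against the Droplet with curl, not part of the original setup) starts a local server whose handler sleeps one second, mirroring the index.php above, and times a single request:

```python
import http.server
import threading
import time
import urllib.request

# A stand-in for the slow PHP page: the handler sleeps one second
# before responding, like sleep(1) in index.php above.
class SlowHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        time.sleep(1)  # simulate backend processing
        body = b"Nothing to see here...\n"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo output quiet

# Bind to port 0 so the OS picks a free port, and serve in the background.
server = http.server.HTTPServer(("127.0.0.1", 0), SlowHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

start = time.perf_counter()
body = urllib.request.urlopen(
    f"http://127.0.0.1:{server.server_port}/"
).read()
elapsed = time.perf_counter() - start
print(f"uncached request took {elapsed:.2f}s")

server.shutdown()
```

Every uncached request pays the full generation cost, just over one second here; that per-request penalty is what Varnish will eliminate for cache hits.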