Something I wasn't aware of until recently: the Nginx FastCGI module comes with a rather nifty caching component.
Basically, via config you can cache responses from your FastCGI backend(s) (e.g. PHP-FPM) straight to disk, with the next request for the same cache key (typically built from $request_uri, but configurable) then served from the disk cache by Nginx directly.
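A minimal sketch of the idea (the cache path, zone name and timings here are illustrative, and the example assumes PHP-FPM listening on a Unix socket):

```nginx
# http context: define a disk cache with a 100 MB shared-memory key zone;
# entries untouched for 60 minutes become eligible for garbage collection.
fastcgi_cache_path /var/cache/nginx/fcgi levels=1:2
                   keys_zone=fcgi_one:100m inactive=60m;

server {
    location ~ \.php$ {
        fastcgi_pass unix:/var/run/php-fpm.sock;
        include fastcgi_params;

        fastcgi_cache fcgi_one;
        fastcgi_cache_key "$scheme$request_method$host$request_uri";
        # Cache successful responses for 10 minutes.
        fastcgi_cache_valid 200 301 302 10m;
    }
}
```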
I have posted some boilerplate config to a Gist here:
Things of note:
- The target cache file path, shared memory zone size and GC timeout are set by
fastcgi_cache_path.
- The cache key is controlled via
fastcgi_cache_key and can include anything in its makeup that's available in Nginx variable land.
- Using a rewrite variable
$fastcgi_skipcache and some if() conditions, we can get quite granular about what goes into the cache versus what passes straight through to your FastCGI backend.
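For example, a common set of skip conditions for a WordPress-style site might look like the following (the exact cookie names and rules are a sketch, not a definitive list):

```nginx
set $fastcgi_skipcache 0;

# Never cache POSTs or requests carrying a query string.
if ($request_method = POST) {
    set $fastcgi_skipcache 1;
}
if ($query_string != "") {
    set $fastcgi_skipcache 1;
}

# Bypass the cache for logged-in WordPress users and comment authors.
if ($http_cookie ~* "wordpress_logged_in|comment_author") {
    set $fastcgi_skipcache 1;
}

fastcgi_cache_bypass $fastcgi_skipcache;  # don't serve these from cache
fastcgi_no_cache $fastcgi_skipcache;      # don't store these in cache
```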
I've had some really good success with this lately on a few highly trafficked yet sluggish WordPress blogs, with minimal code changes required within the apps themselves, including the use of custom HTTP headers on XMLHttpRequests and
$http_[CUSTOM-HEADER] to control cache injection.
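As a sketch of that header trick (the header name X-No-Cache is hypothetical; Nginx exposes any request header as a $http_* variable with the name lowercased and dashes converted to underscores):

```nginx
# If the app sends "X-No-Cache: 1" on an XMLHttpRequest, skip caching
# for that request by flipping the same skip variable used above.
if ($http_x_no_cache) {
    set $fastcgi_skipcache 1;
}
```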
Well worth a look.