The Nginx web server has this great module called HttpGzipStaticModule (http://wiki.nginx.org/HttpGzipStaticModule), which does the following:
Before serving a file from disk to a gzip-enabled client, this module will look for a precompressed file in the same location that ends in ".gz". The purpose is to avoid compressing the same file each time it is requested.
So basically you get Nginx serving gzipped versions of your static assets, without the CPU overhead of deflating them on every request. It also lets you compress resources at the maximum level, since the compression happens "out of request" rather than inline while a client waits.
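With the module compiled in, enabling it is a one-line directive. A minimal sketch (the server name and document root here are made up for illustration):

```nginx
server {
    listen 80;
    server_name example.com;      # hypothetical
    root /var/www/example;        # hypothetical document root

    # For a request to /app.css from a gzip-capable client, serve
    # /app.css.gz from disk instead of compressing on the fly.
    gzip_static on;
}
```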
I put together a small bash script to automate this process: it recursively walks a directory tree, compressing every CSS/JS file it finds to a .gz version alongside it with the same timestamp (as required by HttpGzipStaticModule), and it also cleans up any orphaned .gz files whose source file has been deleted.
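A minimal sketch of such a script might look like this (the `precompress` function name and the CSS/JS extension list are my assumptions, not the author's actual code):

```shell
#!/bin/sh
# Sketch only -- precompress and its behavior are assumptions based on the
# description above, not the author's published script.

precompress() {
    root="$1"

    # Compress every CSS/JS file to a sibling .gz at maximum compression,
    # then copy the original's mtime so the pair share a timestamp.
    find "$root" -type f \( -name '*.css' -o -name '*.js' \) |
    while read -r f; do
        gzip -9 -c "$f" > "$f.gz"
        touch -r "$f" "$f.gz"
    done

    # Remove orphaned .gz files whose source file no longer exists.
    find "$root" -type f -name '*.gz' |
    while read -r gz; do
        [ -e "${gz%.gz}" ] || rm "$gz"
    done
}
```

Invoke it against your assets directory, e.g. `precompress /var/www/example`. Re-running it is safe: existing .gz files are simply rewritten, and `touch -r` keeps the timestamps in sync each time.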