
We've currently set up GZIP compression using the .htaccess DEFLATE output filter. I wondered if anybody would kindly help us understand the following...

Are there any potential problems with using .htaccess to deflate, such as additional server strain from compressing on every request? And is this suitable for a site with 1,200 daily page views that pulls in several JS/CSS files?

We've considered hosting GZIP files alongside our content and creating a script to update the zipped files whenever an unzipped file changes. However, a simple dump of the following code seems much easier, provided it doesn't bring its own issues...

# compress text, html, javascript, css, xml:
AddOutputFilterByType DEFLATE text/plain
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/xml
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE application/xml
AddOutputFilterByType DEFLATE application/xhtml+xml
AddOutputFilterByType DEFLATE application/rss+xml
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/x-javascript
AddOutputFilterByType DEFLATE font/ttf font/otf

Thanks in advance for any advice.

Dave

2 Answers


It really depends on what you are deflating. If you have a decent CPU, the performance hit for text-based files (JS, CSS, etc.) is almost nothing. I would also include mod_expires for cache control of static files. If you were compressing large dynamic files you might run into a performance hit, but the text files covered by your current rules should not have a large impact.
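
For the mod_expires part, a minimal sketch would look like the following. The cache lifetimes here are assumptions, not from the question; tune them to how often your assets actually change:

    <IfModule mod_expires.c>
        ExpiresActive On
        # Static assets change rarely; cache them for a month
        ExpiresByType text/css "access plus 1 month"
        ExpiresByType application/javascript "access plus 1 month"
        ExpiresByType image/png "access plus 1 month"
        # HTML changes often; keep it short
        ExpiresByType text/html "access plus 1 hour"
    </IfModule>

The <IfModule> wrapper keeps the site from throwing a 500 error if mod_expires happens to be disabled.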

So yes, you should be compressing your files for a good user experience, which you can read about here:

https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency/optimize-encoding-and-transfer

And depending on what you are serving, you can find some performance information on mod_deflate here:

http://www.webperformance.com/library/reports/moddeflate/

I personally use mod_deflate on my static content and my sites load very fast. I don't think you will have an issue with what you are currently using it for.


Let's say, however, that you have 20 external JS and 10 CSS files embedded in the same page (as many sites do these days), making a total of 31 files (including the .html file itself), averaging say 20 KB each. I'm guessing that would equate to 31 compression processes on the server side and 31 decompression processes on the client side (someone please confirm).

With 1,200 requests a day, that equates to 37,200 compressions on the server side. I'd be looking at the CPU usage of each one; on a busy server you may run into trouble. For static text files like CSS, HTML and JS it may be better to retain gzipped copies and use something like this in your .htaccess file:

<FilesMatch "\.css\.gz$">
    ForceType text/css
    AddEncoding gzip .gz
</FilesMatch>

<FilesMatch "\.html\.gz$">
    ForceType text/html
    AddEncoding gzip .gz
</FilesMatch>

RewriteEngine On
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteCond %{REQUEST_FILENAME}.gz -f
RewriteRule (.*\.(html|css))$ $1.gz [L]

This serves the existing gzip rather than creating thousands of gzips on the fly every day. You'd have to create the gzips yourself with this method, though.
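
Something along these lines could generate those .gz copies, run from cron or a deploy hook. This is only a sketch; the precompress name, the file extensions scanned, and the gzip level -9 are my assumptions:

```shell
#!/bin/sh
# Pre-generate .gz copies of static CSS/HTML files so Apache can serve
# them directly via the FilesMatch/Rewrite rules above.
precompress() {
    # $1 = document root to scan
    find "$1" -type f \( -name '*.css' -o -name '*.html' \) |
    while read -r f; do
        # Only (re)compress when the source is newer than the .gz copy,
        # so unchanged files aren't gzipped again on every run
        if [ ! -f "$f.gz" ] || [ "$f" -nt "$f.gz" ]; then
            gzip -9 -c "$f" > "$f.gz"
        fi
    done
}

# Example: precompress /var/www/html
```

Re-running it is cheap because the timestamp check skips files whose gzipped copy is already up to date.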