23
votes

I have my website configured to serve static content using gzip compression, like so:

<link rel='stylesheet' href='http://cdn-domain.com/css/style.css.gzip?ver=0.9' type='text/css' media='all' />

I don't see any website doing anything similar. So, the question is, what's wrong with this? Am I to expect shortcomings?

Precisely, as I understand it, most websites are configured to serve normal static files (.css, .js, etc.) and to serve gzipped content (.css.gz, .js.gz, etc.) only if the request comes with an Accept-Encoding: gzip header. Why should they be doing this when all browsers support gzip just the same?
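The negotiation being described can be sketched as a small helper. This is a minimal illustration, not any particular server's implementation; the function name and the simplistic substring check on Accept-Encoding (which ignores q-values) are assumptions for clarity:

```python
import os


def choose_variant(path, accept_encoding):
    """Serve the precompressed sibling only when the client advertises gzip.

    `path` is the uncompressed asset on disk; a `.gz` sibling is assumed to
    have been generated ahead of time. Returns the file to serve plus any
    extra response headers.
    """
    gz_path = path + ".gz"
    if accept_encoding and "gzip" in accept_encoding and os.path.exists(gz_path):
        # Vary tells caches to key on Accept-Encoding so clients that
        # didn't ask for gzip never get the compressed variant.
        return gz_path, {"Content-Encoding": "gzip", "Vary": "Accept-Encoding"}
    return path, {}
```

This is the "trigger" most servers implement: the compressed file is only chosen when the request opts in.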

PS: I am not seeing any performance issues at all because all the static content is gzipped prior to uploading it to the CDN which then simply serves the gzipped files. Therefore, there's no stress/strain on my server.
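The "gzipped prior to uploading" step might look something like the following sketch. The helper name is hypothetical; it simply writes a compressed copy next to each original before the files are pushed to the CDN (note the conventional suffix is .gz, not .gzip):

```python
import gzip
import shutil
from pathlib import Path


def precompress(src):
    """Write a gzipped copy of `src` alongside it, ready for CDN upload.

    e.g. css/style.css -> css/style.css.gz
    """
    src = Path(src)
    dst = src.with_name(src.name + ".gz")
    with open(src, "rb") as f_in, gzip.open(dst, "wb") as f_out:
        shutil.copyfileobj(f_in, f_out)
    return dst
```

Because compression happens once at build/deploy time, the origin server and CDN do no per-request work.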


Just in case it's helpful, here's the HTTP Response Header information for the gzipped CSS file:

Screenshot 1

And this for gzipped favicon.ico file:

Screenshot 2

PS: Possible duplicate, I know—but that question is years old, and the answers based around browser support aren't relevant anymore, IMHO. - its_me
Why wouldn't they be relevant any more? I still stand by my answer there: all modern browsers support receiving content gzipped for transfer. How the content is pre-processed on the server side is irrelevant. You are doing something smart here, but it's not really a new question. As long as your server is still sending the right headers despite it being preprocessed, you will be perfectly fine. Browsers may even save you these days even if you don't send all the right headers. - Matthew Scharley
For reference, many sites do actually do this but most won't keep the .gzip suffix (which should technically be only .gz). - Matthew Scharley
@MatthewScharley The thing is, I am thinking the issue is something more than browsers, the reason why I believe this is not widely adopted (and I don't know what exactly, it could be). Yes, all browsers today support gzip very, very well. So, no issues on the browser-end, or on mobile devices. PS Updated my question with some info. - its_me
Amazon is returning everything just fine, so you should be right. I can't think of any reason why any device wouldn't support Content-Encoding: gzip, it's a very highly used mechanism these days. As I already mentioned, preprocessing the files for gzip compression isn't a new idea (think svgz especially), but most hosts will be set up to handle it transparently without relying on a file name suffix as you've used. Are you actually having an issue? If so, could you highlight that? Currently I'm totally missing the question in your question. - Matthew Scharley

1 Answer

30
votes

Supporting Content-Encoding: gzip isn't a requirement of any current HTTP specification; that's why there is a trigger in the form of the request header.

In practice? If your audience is using a web browser and you are only worried about legitimate users, then there is very, very slim to no chance that anyone will actually be affected by only having preprocessed gzipped versions available. It's a remnant of a bygone age. Browsers these days should handle being force-fed gzipped content even if they don't request it, as long as you also provide them correct headers for the content being given to them.

It's important to realise that an HTTP request/response is a conversation, and that most of the headers in a request are just that: a request. For the most part, the server on the other end is under no obligation to honor any particular headers, and as long as it returns a valid response that makes sense, the client on the other end should do its best to make sense of what was returned. This includes enabling gzip if the server responds that it has used it.
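What a well-behaved client does can be shown in a few lines: it trusts the response's Content-Encoding header, not whatever it asked for. This is a simplified sketch of that behaviour; the function name is an assumption, and real clients also handle other codings and chunking:

```python
import gzip


def decode_body(body, headers):
    """Decompress a response body based on its Content-Encoding header.

    Mirrors what a browser does: the *response* header is authoritative,
    regardless of what Accept-Encoding the client originally sent.
    """
    if headers.get("Content-Encoding", "").lower() == "gzip":
        return gzip.decompress(body)
    return body
```

A client built this way handles force-fed gzip correctly even if it never sent Accept-Encoding: gzip in the first place.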

If your target is machine consumption, however, then be a little wary. People sometimes still think it's a smart idea to write their own HTTP/SMTP/etc. parsers, even though the topic has been done to death in multiple libraries for pretty much every language out there. All the libraries should support gzip just fine, but hand-rolled parsers usually won't.