2 votes

In nearly every example I've seen for setting up Varnish with nginx and SSL support, Varnish runs on port 80, nginx listens on port 443 for SSL termination, and another nginx instance on a separate port does the actual work of communicating with the backend.

Given most websites now redirect port 80 to 443, what advantage is there in having Varnish running on port 80?

Why wouldn't you instead have nginx on port 80 doing the 301 to the HTTPS version, nginx on port 443 doing the SSL termination and proxying to Varnish on a different port, and nginx again on yet another port doing the actual work?

HTTP: nginx [80] (301)

HTTPS: nginx [443] <> Varnish [6081] <> nginx [8080] <> backend
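
For concreteness, a minimal sketch of that three-server-block layout might look like this (the hostname, certificate paths and the 6081/8080 ports are assumptions for illustration, not taken from any particular tutorial):

    # Port 80: redirect everything to HTTPS
    server {
        listen 80;
        server_name example.com;
        return 301 https://$host$request_uri;
    }

    # Port 443: SSL termination, proxy to Varnish
    server {
        listen 443 ssl;
        server_name example.com;
        ssl_certificate     /etc/ssl/example.com.crt;
        ssl_certificate_key /etc/ssl/example.com.key;

        location / {
            proxy_pass http://127.0.0.1:6081;           # Varnish
            proxy_set_header Host $host;
            proxy_set_header X-Forwarded-Proto https;   # so Varnish/backend know the request was HTTPS
            proxy_set_header X-Real-IP $remote_addr;
        }
    }

    # Port 8080: the nginx instance doing the actual work
    server {
        listen 8080;
        server_name example.com;
        root /var/www/example.com;
        # ... fastcgi_pass / proxy_pass to the application goes here ...
    }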

I really can't see any merit in having Varnish on port 80, front of house, just to do a redirect. Unless there's some problem with redirects and the unwanted addition of port numbers to URLs? Maybe adding three nginx server blocks is "more" work for the setup, but having to configure Varnish to redirect port 80, unless it's internal, seems like "more" work too.

Bonus question: why is Apache added to the mix in most of these setups when nginx is already in use, and vice versa? They can both handle SSL termination and proxying.


2 Answers

1 vote

I agree with "why not":

HTTP: nginx [80] (301)

HTTPS: nginx [443] <> Varnish [6081] <> nginx [8080] <> backend

As to why:

HTTP: Varnish [80] (conditional 301, using VCL)

HTTPS: nginx [443] <> Varnish [80] <> nginx [8080] <> backend
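
For reference, the "conditional 301, using VCL" above usually looks something like the following sketch (VCL 4.0; the X-Forwarded-Proto check and the 8080 backend are assumptions, adjust to taste):

    vcl 4.0;

    backend default {
        .host = "127.0.0.1";
        .port = "8080";    # the nginx instance doing the actual work
    }

    sub vcl_recv {
        # Requests arriving directly on port 80 lack the header set by the
        # SSL-terminating proxy, so send them a 301 to the HTTPS version.
        if (req.http.X-Forwarded-Proto != "https") {
            return (synth(301, "Moved Permanently"));
        }
    }

    sub vcl_synth {
        if (resp.status == 301) {
            set resp.http.Location = "https://" + req.http.Host + req.url;
            return (deliver);
        }
    }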

The answer is:

  • Legacy reasons. This was simply the way to go in the "conditional HTTPS" world (where it was fine for a website to work over both HTTP and HTTPS, or to have no HTTPS at all), which was the norm until just a couple of years ago, before Google, as the dominant search engine, started insisting on HTTPS everywhere under threat of poorer search rankings. Only relatively recently did Let's Encrypt make free certificates available to everyone, and the pressure from Google pushed a great many websites to adopt them. The Varnish setup tutorials simply never revisited the port layout, because it never stood out as something that needed adjusting.
  • Expandability. Think beyond a single-server setup. Once you decide to build a stack of Varnish instances (a CDN of sorts), it makes much more sense to keep the "main" Varnish on port 80. The outside/edge Varnish instances talk to the main Varnish rather than to the main backend, a "cache of caches" arrangement (see the sketch after this list). The traffic between edge and main isn't encrypted, but it also carries no encryption overhead.
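
A minimal sketch of what that second point means in practice: an edge Varnish whose backend is the main Varnish on port 80 (the internal hostname here is hypothetical):

    # Hypothetical edge-Varnish VCL: its "backend" is the main Varnish
    # on port 80, not the real application server.
    vcl 4.0;

    backend main_varnish {
        .host = "main-cache.internal";   # assumed internal hostname of the main Varnish
        .port = "80";
    }

    sub vcl_recv {
        set req.backend_hint = main_varnish;
    }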
0 votes

I think we can simplify a bit: HTTPS: nginx [443] <> Varnish [6081] <> backend

Let Varnish do the caching and avoid the extra Nginx layer.

More simplification: hitch [443] <> Varnish [6081] <> backend

Hitch: https://hitch-tls.org/
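
A minimal hitch.conf sketch of that layout, assuming Varnish has been given an extra PROXY-protocol listener (e.g. started with "-a :6086,PROXY"); the certificate path is a placeholder:

    # Terminate TLS on 443 and pass the decrypted traffic to Varnish.
    frontend = "[*]:443"
    backend  = "[127.0.0.1]:6086"            # Varnish PROXY-protocol listener (assumed)
    pem-file = "/etc/hitch/example.com.pem"
    write-proxy-v2 = on                      # preserve client IP/port for Varnish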