Nginx HTTP and HTTPS Default Server
Nginx is rad. I use it for all my web hosting needs, and it has proven itself to be
extremely stable, secure and reliable. What Nginx isn't as well prepared for out
of the box, though, is requests that don't match any of your virtual hosts. If
you serve multiple sites from one IP address, simply entering the server IP into
a web browser will lead you straight to the first server block defined in your
nginx.conf. Chances are that this is not what you want.
Add the following lines as the first server block in your nginx.conf to
remedy this situation:
server {
    listen 1.2.3.4:80 default_server;
    listen [::]:80 default_server;

    server_name _;
    server_name_in_redirect off;

    log_not_found off;
    return 410;
}
Reload Nginx, done.
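To see what clients now get, here is a reproducible sketch: a throwaway local stub that answers 410 like the block above, checked with curl. The stub, port 8410 and the curl flags are my illustrative choices, not part of the Nginx setup; on your real machine, `curl -sI http://<server-ip>/` is all you need.

```shell
# Stand-in for the default server: answers every request with 410 Gone
# (port 8410 is arbitrary; on the real box you would curl the server IP)
python3 -c '
from http.server import BaseHTTPRequestHandler, HTTPServer
class Gone(BaseHTTPRequestHandler):
    def do_HEAD(self):
        self.send_response(410)
        self.end_headers()
    do_GET = do_HEAD
HTTPServer(("127.0.0.1", 8410), Gone).serve_forever()
' &
STUB_PID=$!
sleep 1
STATUS=$(curl -sI http://127.0.0.1:8410/ | head -n 1)
echo "$STATUS"
kill "$STUB_PID"
```

You should see `HTTP/1.0 410 Gone` from the stub (Nginx itself answers the same status over HTTP/1.1 or HTTP/2).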
“Great”, you say, “but what about HTTPS?”
Well, it’s possible but due to the nature of SSL/TLS, you cannot simply redirect all HTTPS traffic to HTTP. A successful SSL connection has to be established before doing anything else, otherwise everything breaks. To help lubricate the gears of this mechanism, you need some snake oil.
Create a self-signed certificate:
openssl req -newkey rsa:2048 -nodes -keyout snakeoil-key.pem -x509 -days 3650 -out snakeoil-certificate.pem
No need to fill in anything, you can just press Return until you're back at the
prompt. Then move those files somewhere you feel comfortable having them. On my
FreeBSD 11 machine, I chose /usr/local/www/default_site.
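If you'd rather skip the prompts entirely (handy in scripts), `openssl req` can take the subject on the command line via `-subj`; here is a non-interactive variant of the command above (the CN value is arbitrary, since nothing will ever validate this certificate):

```shell
# Same snake-oil certificate, but fully non-interactive: -subj pre-fills
# the fields you would otherwise skip through with Return
openssl req -newkey rsa:2048 -nodes -subj "/CN=snakeoil" \
    -keyout snakeoil-key.pem -x509 -days 3650 -out snakeoil-certificate.pem
chmod 600 snakeoil-key.pem   # keep the private key private
```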
Now add the following to your nginx.conf:
server {
    listen 1.2.3.4:443 ssl http2 default_server;
    listen [::]:443 ssl http2 default_server;
    server_name _;

    ssl_certificate /usr/local/www/default_site/snakeoil-certificate.pem;
    ssl_certificate_key /usr/local/www/default_site/snakeoil-key.pem;

    ## To keep old/ancient versions of Nginx from falling victim to
    ## SSLv3 vulnerabilities, uncomment the following lines:
    #
    # ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
    # ssl_ciphers HIGH:!aNULL:!MD5;

    server_name_in_redirect off;
    log_not_found off;
    return 410;
}
Once you reload, both plain-HTTP and HTTPS requests to the bare IP will just
return a 410 Gone status, instructing bots and search engines to remove this
URL or never index it in the first place, and preventing nosy visitors from
checking out what other sites you host.
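If you want to watch the HTTPS side work without touching a production box, the same stub trick applies with a freshly minted snake-oil certificate. Everything below (ports, /tmp paths, the Python stub) is illustrative scaffolding, not part of the Nginx config; the one transferable detail is `-k`, since curl refuses self-signed certificates unless told otherwise. Against your real server, the check would be `curl -skI https://<server-ip>/`.

```shell
# Mint a throwaway self-signed pair, serve 410 over TLS, and check with curl.
# -k is required because the certificate is self-signed and will not validate.
openssl req -newkey rsa:2048 -nodes -subj "/CN=snakeoil" \
    -keyout /tmp/so-key.pem -x509 -days 1 -out /tmp/so-cert.pem 2>/dev/null
python3 -c '
import ssl
from http.server import BaseHTTPRequestHandler, HTTPServer
class Gone(BaseHTTPRequestHandler):
    def do_HEAD(self):
        self.send_response(410)
        self.end_headers()
    do_GET = do_HEAD
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.load_cert_chain("/tmp/so-cert.pem", "/tmp/so-key.pem")
srv = HTTPServer(("127.0.0.1", 8443), Gone)
srv.socket = ctx.wrap_socket(srv.socket, server_side=True)
srv.serve_forever()
' &
STUB_PID=$!
sleep 1
TLS_STATUS=$(curl -skI https://127.0.0.1:8443/ | head -n 1)
echo "$TLS_STATUS"
kill "$STUB_PID"
```

Note how the TLS handshake completes against the snake-oil certificate before the 410 ever reaches the client, which is exactly why the default server needs a certificate at all.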